OTA (Over-The-Air) Localization

Update translations in real time without redeploying your application. LRM is the first and only OTA localization solution for .NET!

Why OTA?

Traditional localization requires a full deployment cycle to fix a typo or add a new language. With OTA, your app fetches translations from LRM Cloud at runtime, allowing instant updates without touching your codebase.

How It Works

💻 App starts with local resources → 🔄 Background service syncs → ☁️ Fetches from LRM Cloud → ✔️ Live updates applied
  1. Immediate start — Your app launches using embedded or local resources (no network delay)
  2. Background sync — A background service fetches the latest translations from LRM Cloud (a simplified sketch of this loop follows the list)
  3. ETag caching — Efficient bandwidth usage; only downloads when content changes
  4. Auto-refresh — Syncs automatically every 5 minutes (configurable)
  5. Graceful fallback — Uses local resources if cloud is unreachable
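
The five steps above amount to a single polling loop: fetch conditionally, apply the bundle if it changed, sleep for the refresh interval, and keep whatever is already loaded when the network fails. The sketch below is illustrative only and is not the library's actual implementation; the class name and the fetch/apply delegates are hypothetical placeholders.

using System;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

// Illustrative only: a simplified picture of the background sync loop, not the
// library's actual code. The delegates stand in for "fetch the bundle with a
// conditional GET" and "apply the new translations".
public sealed class OtaSyncLoopSketch
{
    private readonly Func<string?, CancellationToken, Task<(string? Json, string? ETag, bool NotModified)>> _fetch;
    private readonly Action<string> _apply;
    private readonly TimeSpan _interval;
    private string? _etag;   // last ETag seen, reused for conditional requests

    public OtaSyncLoopSketch(
        Func<string?, CancellationToken, Task<(string? Json, string? ETag, bool NotModified)>> fetch,
        Action<string> apply,
        TimeSpan? interval = null)
    {
        _fetch = fetch;
        _apply = apply;
        _interval = interval ?? TimeSpan.FromMinutes(5);   // matches the default above
    }

    public async Task RunAsync(CancellationToken ct)
    {
        while (!ct.IsCancellationRequested)
        {
            try
            {
                // Conditional fetch: the server answers 304 Not Modified when nothing changed.
                var (json, etag, notModified) = await _fetch(_etag, ct);
                if (!notModified && json is not null)
                {
                    _apply(json);   // swap in the fresh translations
                    _etag = etag;   // remember the version for the next poll
                }
            }
            catch (HttpRequestException)
            {
                // Cloud unreachable: keep serving local or previously cached resources.
            }

            await Task.Delay(_interval, ct);   // wait for the next refresh cycle
        }
    }
}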

Benefits

Use Case         | Without OTA                      | With OTA
-----------------|----------------------------------|-------------------------------
Fix a typo       | Code change, build, test, deploy | Edit in LRM Cloud, instant fix
Add new language | Add files, rebuild, redeploy     | Translate in cloud, auto-sync
Emergency update | Hotfix deployment required       | Update immediately via web UI
A/B test copy    | Feature flags + deployment       | Change translations live

.NET SDK Available

The .NET SDK provides full OTA support through the LocalizationManager.JsonLocalization NuGet package.

Full Library Documentation

For complete documentation on the JsonLocalization library including standalone usage, ASP.NET Core integration, pluralization, and source generators, see the .NET Libraries page.

Installation

dotnet add package LocalizationManager.JsonLocalization

Quick Setup

// Program.cs
builder.Services.AddJsonLocalizationWithOta(options =>
{
    options.UseOta(
        endpoint: "https://lrm-cloud.com",
        apiKey: "lrm_your_read_only_api_key",
        project: "@username/my-project"  // or "org/project"
    );

    // Optional: Configure refresh interval (default: 5 minutes)
    options.Ota!.RefreshInterval = TimeSpan.FromMinutes(5);

    // Optional: Fall back to local resources when offline
    options.FallbackToLocal = true;
    options.ResourcesPath = "Resources";  // Local fallback path
});
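
Once registered, strings are typically resolved through the standard Microsoft.Extensions.Localization abstractions, and values refreshed over OTA are picked up on the next sync. A minimal sketch, assuming the package plugs into IStringLocalizer<T>; the SharedResources marker type and the controller are placeholders for this example.

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Localization;

// Placeholder marker type used to group shared resources for this example.
public class SharedResources { }

[ApiController]
[Route("api/[controller]")]
public class GreetingController : ControllerBase
{
    private readonly IStringLocalizer<SharedResources> _localizer;

    public GreetingController(IStringLocalizer<SharedResources> localizer)
        => _localizer = localizer;

    [HttpGet]
    public string Get(string name)
        // "Greeting" resolves to the latest value delivered over OTA,
        // e.g. "Hello, {0}!" in the bundle format shown later on this page.
        => _localizer["Greeting", name];
}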

Creating an API Key

  1. Go to Project Settings → API Keys in LRM Cloud
  2. Click "Create API Key"
  3. Select "Read" scope (sufficient for OTA)
  4. Copy the key (starts with lrm_)
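
Avoid hardcoding the key in source. One option is to read it from configuration (user secrets, environment variables, a vault, etc.); the Lrm:ApiKey and Lrm:Project configuration keys below are arbitrary names chosen for this sketch, not something LRM requires.

// Program.cs: read OTA settings from configuration instead of hardcoding them.
// "Lrm:ApiKey" and "Lrm:Project" are arbitrary configuration keys chosen for this sketch.
var apiKey = builder.Configuration["Lrm:ApiKey"]
    ?? throw new InvalidOperationException("Missing Lrm:ApiKey configuration value.");
var project = builder.Configuration["Lrm:Project"] ?? "@username/my-project";

builder.Services.AddJsonLocalizationWithOta(options =>
{
    options.UseOta(
        endpoint: "https://lrm-cloud.com",
        apiKey: apiKey,
        project: project
    );
});

During development the key can be stored with dotnet user-secrets set "Lrm:ApiKey" "<your key>", or supplied through an environment variable such as Lrm__ApiKey.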

Configuration Options

Option          | Default               | Description
----------------|-----------------------|--------------------------------------------
Endpoint        | https://lrm-cloud.com | LRM Cloud API endpoint
ApiKey          | (none)                | API key with read scope (required)
Project         | (none)                | Project path: @user/project or org/project
RefreshInterval | 5 minutes             | How often to check for updates
FallbackToLocal | true                  | Use local resources when offline
Timeout         | 10 seconds            | HTTP request timeout
MaxRetries      | 3                     | Retry attempts for failed requests
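
The timeout and retry settings from the table can presumably be adjusted the same way as the refresh interval. The exact property locations below are an assumption based on the options.Ota!.RefreshInterval example above, so verify the names against the package before relying on them.

builder.Services.AddJsonLocalizationWithOta(options =>
{
    options.UseOta(
        endpoint: "https://lrm-cloud.com",
        apiKey: "lrm_your_read_only_api_key",
        project: "@username/my-project"
    );

    // Assumed property names mirroring the options table above; verify against the package.
    options.Ota!.RefreshInterval = TimeSpan.FromMinutes(10);  // poll less frequently
    options.Ota!.Timeout = TimeSpan.FromSeconds(10);          // HTTP request timeout
    options.Ota!.MaxRetries = 3;                              // retry attempts

    options.FallbackToLocal = true;
    options.ResourcesPath = "Resources";
});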

Supported Platforms

OTA works with the entire .NET ecosystem:

  • ASP.NET Core (Web APIs, MVC, Razor Pages)
  • Blazor (Server + WebAssembly)
  • .NET MAUI (iOS, Android, Windows, macOS)
  • Avalonia (Cross-platform desktop)
  • WPF and WinForms
  • Console applications
  • Azure Functions / AWS Lambda
  • Worker Services

Network Resilience

The OTA client includes built-in resilience features:

  • Retry with exponential backoff — Automatically retries failed requests (see the backoff sketch after this list)
  • Circuit breaker — Stops requests after repeated failures, auto-recovers
  • ETag caching — Efficient bandwidth usage, only fetches when changed
  • Graceful fallback — Uses local resources when cloud is unavailable
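
To make the backoff behaviour concrete, here is a standalone illustration of retry with exponential backoff and jitter. It shows the general technique only and is not the OTA client's internal code; the class and method names are made up for this example.

using System;
using System.Threading.Tasks;

// Conceptual illustration of "retry with exponential backoff", not the OTA client's internals.
public static class BackoffExample
{
    public static async Task<T> RetryWithBackoffAsync<T>(
        Func<Task<T>> action, int maxRetries = 3, int baseDelayMs = 500)
    {
        for (var attempt = 0; ; attempt++)
        {
            try
            {
                return await action();
            }
            catch (Exception) when (attempt < maxRetries)
            {
                // Delay doubles on each attempt (500 ms, 1 s, 2 s, ...) plus a little
                // jitter so many clients do not retry in lockstep.
                var delay = baseDelayMs * (1 << attempt) + Random.Shared.Next(0, 250);
                await Task.Delay(delay);
            }
        }
    }
}

// Usage: var json = await BackoffExample.RetryWithBackoffAsync(() => httpClient.GetStringAsync(url));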

Source Generator Compatibility

When using OTA with the LocalizationManager.JsonLocalization.Generator package:

  • Generated classes work for compile-time keys
  • New OTA keys use dynamic access: Strings.Localizer["NewKey"]
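
A short illustration of the difference; Strings.Welcome stands in for whatever member the generator emits for a compile-time key (an assumption for this example), while the indexer form matches the dynamic access shown above.

// Key that existed at build time: strongly typed access through the generated class.
// (The exact generated member is an assumption made for illustration.)
var title = Strings.Welcome;

// Key added later in LRM Cloud and delivered over OTA: dynamic access by name.
var banner = Strings.Localizer["NewKey"];

// Format arguments can still be passed through the indexer.
var greeting = Strings.Localizer["Greeting", "Ada"];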

Coming Soon

We're working on OTA SDKs for additional platforms:

SDK         | Status  | Description
------------|---------|----------------------------------------------
Python SDK  | Planned | OTA for Django, Flask, FastAPI applications
Node.js SDK | Planned | OTA for Express, Next.js, React applications
Swift SDK   | Planned | OTA for iOS and macOS native apps
Kotlin SDK  | Planned | OTA for Android native apps

Want an SDK for your platform?

Let us know! Open an issue on GitHub to request an SDK for your platform.

API Format

The OTA endpoint returns a JSON bundle containing all translations for a project:

Endpoint

GET /api/ota/{owner}/{project}/bundle
X-API-Key: lrm_your_api_key

Response Format

{
  "version": "2025-01-15T10:30:00.000Z",
  "project": "@username/my-project",
  "defaultLanguage": "en",
  "languages": ["en", "fr", "de", "es"],
  "deleted": [],
  "translations": {
    "en": {
      "Welcome": "Welcome to our app!",
      "Greeting": "Hello, {0}!",
      "Items": {
        "one": "{0} item",
        "other": "{0} items"
      }
    },
    "fr": {
      "Welcome": "Bienvenue dans notre application!",
      "Greeting": "Bonjour, {0}!",
      "Items": {
        "one": "{0} article",
        "other": "{0} articles"
      }
    }
  }
}
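
If you consume this payload manually rather than through the SDK, a small DTO along these lines can deserialize it with System.Text.Json. The class is illustrative, not a type shipped by the library; translation values stay as JsonElement because an entry can be either a plain string or a pluralization object.

using System.Collections.Generic;
using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical DTO mirroring the bundle shape above.
public sealed class OtaBundle
{
    [JsonPropertyName("version")] public string Version { get; set; } = "";
    [JsonPropertyName("project")] public string Project { get; set; } = "";
    [JsonPropertyName("defaultLanguage")] public string DefaultLanguage { get; set; } = "";
    [JsonPropertyName("languages")] public List<string> Languages { get; set; } = new();

    // Assumed to be a list of removed keys; the example above only shows an empty array.
    [JsonPropertyName("deleted")] public List<string> Deleted { get; set; } = new();

    // language -> key -> value, where a value is either a string or a plural-forms object.
    [JsonPropertyName("translations")]
    public Dictionary<string, Dictionary<string, JsonElement>> Translations { get; set; } = new();
}

// Usage: var bundle = JsonSerializer.Deserialize<OtaBundle>(json);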

ETag Caching

The API supports ETag-based caching for efficient polling:

// Initial request
GET /api/ota/@user/project/bundle
Response: 200 OK
ETag: "abc123"

// Subsequent request with If-None-Match
GET /api/ota/@user/project/bundle
If-None-Match: "abc123"
Response: 304 Not Modified (no body, bandwidth saved)
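
If you want to call the endpoint yourself instead of going through the SDK, a conditional GET with HttpClient looks roughly like this. The URL pattern and the X-API-Key header come from the endpoint description above; everything else is plain HttpClient usage.

using System;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

// Sketch of a conditional GET against the bundle endpoint.
var client = new HttpClient { BaseAddress = new Uri("https://lrm-cloud.com") };
client.DefaultRequestHeaders.Add("X-API-Key", "lrm_your_read_only_api_key");

string? cachedEtag = null;   // persisted from the previous successful fetch

var request = new HttpRequestMessage(HttpMethod.Get, "/api/ota/@username/my-project/bundle");
if (cachedEtag is not null)
    request.Headers.IfNoneMatch.Add(new EntityTagHeaderValue(cachedEtag));

var response = await client.SendAsync(request);

if (response.StatusCode == HttpStatusCode.NotModified)
{
    // 304: nothing changed, keep using the cached bundle.
    Console.WriteLine("Bundle unchanged, using cached translations.");
}
else if (response.IsSuccessStatusCode)
{
    var body = await response.Content.ReadAsStringAsync();
    cachedEtag = response.Headers.ETag?.Tag;   // e.g. "\"abc123\"", quotes included
    Console.WriteLine($"Fetched {body.Length} bytes, ETag {cachedEtag}");
}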

Sample Projects

ConsoleApp.OtaDemo: Complete OTA demo with mock HTTP handler (no server required). Demonstrates bundle fetching, ETag caching, culture switching, pluralization, live updates, and fallback.

Running the Demo

# Clone the repository
git clone https://github.com/nickprotop/LocalizationManager.git
cd LocalizationManager

# Run the OTA demo
cd samples/ConsoleApp.OtaDemo
dotnet run

The demo uses a MockOtaHandler that simulates the LRM Cloud API, so you can explore all OTA features without needing a real server connection.
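
The handler in the sample is specific to the repository, but the general idea is a custom HttpMessageHandler that serves a canned bundle and honours If-None-Match. A rough, standalone sketch along those lines (not the sample's actual code):

using System.Linq;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

// Minimal stand-in for the OTA server: returns one fixed bundle and honours If-None-Match.
// This is a rough sketch, not the MockOtaHandler shipped in the sample.
public sealed class FakeOtaHandler : HttpMessageHandler
{
    private const string ETag = "\"demo-etag-1\"";

    private const string Bundle =
        """
        {
          "version": "2025-01-15T10:30:00.000Z",
          "defaultLanguage": "en",
          "languages": ["en"],
          "translations": { "en": { "Welcome": "Welcome to our app!" } }
        }
        """;

    protected override Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        // If the client already has this version, answer 304 with no body.
        if (request.Headers.IfNoneMatch.Any(tag => tag.Tag == ETag))
            return Task.FromResult(new HttpResponseMessage(HttpStatusCode.NotModified));

        var response = new HttpResponseMessage(HttpStatusCode.OK)
        {
            Content = new StringContent(Bundle, Encoding.UTF8, "application/json")
        };
        response.Headers.ETag = new EntityTagHeaderValue(ETag);
        return Task.FromResult(response);
    }
}

// Usage: var client = new HttpClient(new FakeOtaHandler());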