MIRA’s provider system is designed for extension. Adding a new provider requires changes to both Python engine files and the TypeScript settings UI.

1. Add the provider constant

In src/shared/constants.ts, add the provider to both engine provider lists:
export const RLM_PROVIDERS = ['bedrock', 'anthropic', 'openai', 'ollama', 'my-provider'] as const
export const NAE_PROVIDERS = ['openai', 'anthropic', 'ollama', 'bedrock', 'my-provider'] as const
Also add suggested models:
export const RLM_PROVIDER_MODELS: Record<string, { id: string; label: string }[]> = {
  // ...existing providers
  'my-provider': [{ id: 'my-model-v1', label: 'My Model v1' }],
}

2. Add credentials to the credential store

MIRA uses Electron’s safeStorage API to store secrets encrypted in credentials.json. API tokens are stored as named entries via the credentials:save-token IPC channel. For a standard API-key provider, no changes to src/main/credential-store.ts are needed: the user adds a new API token entry via Settings → API Tokens, using the env var name your engine expects. If your provider needs OAuth or a non-standard credential shape, extend CredentialStore in src/main/credential-store.ts.
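On the engine side, the saved token simply arrives as an environment variable. A minimal sketch of how Python engine code might read it, assuming the env var name `MY_PROVIDER_API_KEY` used in the examples below (the helper name is hypothetical, not part of MIRA):

```python
import os

def get_provider_api_key(env_var: str = "MY_PROVIDER_API_KEY") -> str:
    """Read the API token that MIRA passes to the engine as an env var.

    Hypothetical helper; fail loudly so a missing token is easy to diagnose.
    """
    token = os.environ.get(env_var, "")
    if not token:
        raise RuntimeError(
            f"{env_var} is not set; add it under Settings → API Tokens"
        )
    return token
```

Failing fast here surfaces a misnamed env var immediately instead of as an opaque 401 from the provider.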

3. Implement the NAE Python client

The NAE uses a single LLMClient class in resources/nae_context/llm_client.py. Add a new branch inside the relevant methods (_headers, _chat_payload, stream, complete) for your provider:
# resources/nae_context/llm_client.py

def _base_url(self) -> str:
    if self._config.base_url:
        return self._config.base_url.rstrip("/")
    if self._provider == "my-provider":
        return "https://api.my-provider.com/v1"
    # ...existing providers
    return _OPENAI_BASE

def _headers(self) -> dict:
    hdrs = {"Content-Type": "application/json"}
    if self._provider == "my-provider":
        hdrs["Authorization"] = f"Bearer {os.environ.get('MY_PROVIDER_API_KEY', '')}"
    # ...existing providers
    return hdrs
Also update NAEConfig in resources/nae_context/nae_types.py to document the new provider name:
@dataclass
class NAEConfig:
    provider: str = "openai"   # openai | anthropic | ollama | bedrock | my-provider
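The `_chat_payload` branch follows the same shape. A sketch, written as a standalone function for illustration (the real logic is a method on LLMClient), assuming an OpenAI-style chat schema; the `max_tokens` tweak is purely illustrative of a provider-specific adjustment:

```python
def build_chat_payload(provider: str, model: str, messages: list[dict],
                       stream: bool = False) -> dict:
    """Hypothetical sketch of the _chat_payload branch for a new provider."""
    # Common OpenAI-style fields shared by most chat-completions APIs.
    payload = {"model": model, "messages": messages, "stream": stream}
    if provider == "my-provider":
        # Illustrative provider-specific field; use whatever your
        # provider's API actually requires here.
        payload["max_tokens"] = 1024
    return payload
```

If your provider deviates further from the OpenAI schema (e.g. a top-level system prompt, as Anthropic uses), this is the place to translate the message list.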

4. Implement the RLM Python client (if needed)

For RLM, the provider is handled via boto3 (Bedrock) or LangChain-compatible adapters in resources/rlm_context/. If your provider supports an OpenAI-compatible API endpoint, you may be able to configure it via the base_url override without code changes.
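For the OpenAI-compatible path, the override just changes where requests are sent. A sketch of the URL resolution, assuming the standard `/chat/completions` path (the function name is hypothetical):

```python
def resolve_chat_url(base_url: str) -> str:
    """Join an OpenAI-compatible base URL with the chat completions path,
    tolerating a trailing slash in the configured override."""
    return base_url.rstrip("/") + "/chat/completions"
```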

5. Add Python dependencies

Add the provider’s SDK to resources/requirements.txt:
my-provider-sdk>=1.0.0

6. Add the settings UI component

Settings components live in src/renderer/src/components/settings/. Create a new component following the pattern of AwsCredentials.tsx or ApiTokens.tsx:
// src/renderer/src/components/settings/MyProviderSettings.tsx
export function MyProviderSettings() {
  // - Masked API key input using window.api.invoke('credentials:save-token', ...)
  // - Model selector populated from RLM_PROVIDER_MODELS constant
  // - Save / Remove buttons
}
Register the component in EngineSettings.tsx or SettingsModal.tsx depending on whether it’s an engine-level or global provider setting.

7. Write tests

Add provider tests in resources/nae_context/ (manual integration test) and cover the settings component with a renderer test once TypeScript testing is set up.
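For the Python side, a sketch of what a unit test for the header logic from step 3 could look like. This uses a stand-in method mirroring that branch; a real test would import LLMClient from resources/nae_context/llm_client.py instead (the test class name is hypothetical):

```python
import os
import unittest

class MyProviderHeadersTest(unittest.TestCase):
    """Sketch: verify the Authorization header is built from the env var."""

    def _headers(self, provider: str) -> dict:
        # Stand-in mirroring the _headers branch added in step 3.
        hdrs = {"Content-Type": "application/json"}
        if provider == "my-provider":
            hdrs["Authorization"] = (
                f"Bearer {os.environ.get('MY_PROVIDER_API_KEY', '')}"
            )
        return hdrs

    def test_authorization_header_uses_env_token(self):
        os.environ["MY_PROVIDER_API_KEY"] = "secret"
        self.assertEqual(
            self._headers("my-provider")["Authorization"], "Bearer secret"
        )
```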

8. Update docs

Add a new page at docs/configuration/providers/my-provider.mdx following the pattern of the existing provider pages, and add it to the Providers navigation group in docs/docs.json.