Press ⌘, to open Settings. The left sidebar groups settings by category.

Providers

| Provider | Settings page |
| --- | --- |
| AWS Bedrock | AWS Bedrock |
| OpenAI | OpenAI |
| Anthropic | Anthropic (Claude) |
| Ollama | Ollama (Local Models) |

Engine

| Sub-section | Settings page |
| --- | --- |
| Native Agent Engine | NAE Settings |
| RLM Engine | RLM Settings |

MCP Tools

Manage stdio and SSE MCP servers. See Adding a stdio Server and Adding an SSE Server.
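In broad terms, a stdio server is defined by a command to launch locally, while an SSE server is defined by a URL to connect to. The entries below are a hypothetical sketch of the two shapes; the field names and values are illustrative assumptions, not MIRA's actual configuration schema:

```python
# Hypothetical MCP server entries -- field names are illustrative only,
# not MIRA's actual configuration schema.
stdio_server = {
    "name": "filesystem",
    "transport": "stdio",
    "command": "npx",  # executable launched as a subprocess
    "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
}

sse_server = {
    "name": "remote-tools",
    "transport": "sse",
    "url": "https://example.com/mcp/sse",  # SSE endpoint to connect to
}
```

See the linked pages for the exact fields MIRA expects for each transport.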

Eval Framework

| Setting | Default | Description |
| --- | --- | --- |
| Embedding model | (none) | Provider + model used for semantic similarity evals |
| Default LLM judge model | (none) | Provider + model used as the default for LLM judge evals |
| Run concurrency | 1 | Number of eval cases to run in parallel (1–5) |
| Default pass threshold | 3 (human review), 0.80 (semantic), — (exact) | Per-type default pass thresholds |
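The per-type thresholds above act as pass/fail cutoffs. As a minimal sketch of the semantic case (the function names and score scale here are assumptions, not MIRA's implementation), a semantic similarity eval compares the embedding similarity of expected and actual output against the 0.80 default:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (range -1.0 to 1.0)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def semantic_eval_passes(expected_emb, actual_emb, threshold=0.80):
    # 0.80 mirrors the default pass threshold for semantic evals above.
    return cosine_similarity(expected_emb, actual_emb) >= threshold
```

Identical embeddings score 1.0 and pass; orthogonal ones score 0.0 and fail. Raising the threshold in Settings makes semantic evals stricter.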

Data Management

See Reset & Data Management.

Preferences

See Preferences.

Security

See API Tokens & Security.

About

Displays the installed MIRA version, open source licences, and a link to the changelog.