How much does MIRA cost?
The MIRA desktop app is free and open source (MIT licence). You supply your own AI provider API keys. Provider costs depend on the models you choose — Ollama is completely free (local only), while cloud providers charge per token.
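To get a feel for per-token billing, a minimal sketch — the rates below are made-up placeholders, not any provider's actual prices:

```python
def estimate_cost(input_tokens, output_tokens, in_rate_per_m, out_rate_per_m):
    """Estimate spend in dollars given per-million-token rates."""
    return (input_tokens / 1_000_000) * in_rate_per_m \
         + (output_tokens / 1_000_000) * out_rate_per_m

# 100k input + 20k output tokens at a hypothetical $3/M input, $15/M output:
print(round(estimate_cost(100_000, 20_000, 3.0, 15.0), 4))
```

Check your provider's pricing page for real rates — they vary widely by model.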
What data leaves my machine?
MIRA sends your messages and documents only to the AI provider you have configured. No telemetry, usage analytics, or crash reports are sent anywhere. See Privacy & Data for the full outbound call list.
Should I use NAE or RLM?
It depends on the task:
  • NAE is better for conversational reasoning, long-document analysis, and tasks where you want sub-agents to specialise (research, writing, analysis).
  • RLM is better for tasks that benefit from code execution — data processing, calculations, structured data extraction, and iterative refinement.
See Reasoning Engines for a full comparison.
Can I run MIRA fully offline?
Yes — configure Ollama to run a local model. All reasoning and data stay on your machine.
Where is my data stored?
All data is stored in mira.db (SQLite) in your OS app data directory:
  • macOS: ~/Library/Application Support/MIRA/mira.db
  • Windows: %APPDATA%\MIRA\mira.db
  • Linux: ~/.config/MIRA/mira.db
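Since mira.db is plain SQLite, any SQLite client can inspect it. A read-only sketch — it assumes nothing about MIRA's schema and just lists whatever tables exist:

```python
import sqlite3

def list_tables(db_path):
    """Return the table names in a SQLite database, opened read-only."""
    conn = sqlite3.connect(f"file:{db_path}?mode=ro", uri=True)
    try:
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
        )
        return [name for (name,) in rows]
    finally:
        conn.close()

# macOS path from the list above; substitute the Windows/Linux path as needed:
# print(list_tables("/Users/you/Library/Application Support/MIRA/mira.db"))
```

Opening read-only (`mode=ro`) avoids accidentally writing to the database while MIRA is running.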
How do I back up my skills and workflows?
Export them as JSON files from the Skills or Workflows view (⋮ → Export). You can also copy mira.db directly to back up everything at once.
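Copying mira.db is easy to script. A sketch using the platform paths listed above — quit MIRA first so SQLite is not mid-write:

```python
import datetime
import pathlib
import shutil

def backup_mira(db_path, dest_dir):
    """Copy mira.db into dest_dir under a date-stamped name."""
    stamp = datetime.date.today().strftime("%Y%m%d")
    dest = pathlib.Path(dest_dir) / f"mira-backup-{stamp}.db"
    shutil.copy2(db_path, dest)  # copy2 also preserves file timestamps
    return dest

# macOS path from the list above; substitute the Windows/Linux path as needed:
# backup_mira(pathlib.Path.home() / "Library/Application Support/MIRA/mira.db",
#             pathlib.Path.home() / "Backups")
```

Restoring is the reverse: quit MIRA and copy the backup file back over mira.db.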
Can I use MIRA on multiple machines?
Yes — export your skills, workflows, and eval profiles as JSON files and import them on each machine. There is no cloud sync — each installation is independent.
Which file types can I attach?
PDF, DOCX, TXT, MD, CSV, JSON, and code files (.py, .js, .ts, .rs, .go, .java, .cpp, .c, .yaml, .yml, .toml, .xml, .html, .css). Maximum file size is 50 MB. See Supported Formats.
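If you batch-prepare documents, you can pre-check them against these limits. An illustrative sketch — the extension set and 50 MB cap come from the list above, but the helper itself is not MIRA's actual validation code:

```python
from pathlib import Path

ALLOWED = {".pdf", ".docx", ".txt", ".md", ".csv", ".json",
           ".py", ".js", ".ts", ".rs", ".go", ".java", ".cpp", ".c",
           ".yaml", ".yml", ".toml", ".xml", ".html", ".css"}
MAX_BYTES = 50 * 1024 * 1024  # 50 MB cap from the docs

def is_attachable(path):
    """True if the file has an accepted extension and is within the size cap."""
    p = Path(path)
    return p.suffix.lower() in ALLOWED and p.stat().st_size <= MAX_BYTES
```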
Why does my OS warn me during installation?
MIRA is distributed outside the Mac App Store / Microsoft Store, so macOS and Windows require you to approve apps from unknown sources. See the macOS installation guide or Windows installation guide for the exact steps.
Can I contribute to MIRA?
Absolutely. See the Contributing guide to get started.
How do I report a bug or request a feature?
Open an issue on GitHub using the bug report or feature request template.
Does MIRA run on Windows ARM?
Not currently. Windows ARM support is planned for a future release. Track progress on GitHub Issues.