Mozilla's MZLA Technologies subsidiary shipped Thunderbolt on Thursday, an open-source AI client that wants to live on your own servers instead of Microsoft's. Or OpenAI's. Or Anthropic's. Under the hood it wires into Berlin-based deepset's Haystack for agent orchestration and retrieval, then exposes Model Context Protocol and Agent Client Protocol hooks so whichever model you already run can slot in. MZLA CEO Ryan Sipes told The Register the target is companies that refuse to let internal data leave their network. The GitHub page is blunt about where the product sits today. Pre-audit. Pre-production.
Key Takeaways
- Mozilla subsidiary MZLA shipped Thunderbolt on April 16, an open-source AI client enterprises self-host instead of renting Copilot or ChatGPT Enterprise.
- The client rides on Berlin-based deepset's Haystack for RAG and agent orchestration, and speaks MCP and ACP so any compatible model plugs in.
- Mozilla shipped pre-audit, pre-production code under MPL 2.0, with native apps for Windows, macOS, Linux, iOS, Android, and the web.
- CEO Ryan Sipes is framing it as a Firefox-versus-Internet Explorer moment, but regulated buyers will not sign until the security audit closes.
AI-generated summary, reviewed by an editor. More on our AI guidelines.
A client, not a model
Thunderbolt ships no inference of its own. It is a front end, deliberately so, with LLM calls passing through a backend proxy that connects to Anthropic, OpenAI, Mistral, OpenRouter, or any OpenAI-compatible endpoint. Local inference runs through Ollama and llama.cpp. The repository lists Chat Mode and Search Mode as available today, with Research Mode, Tasks, and full MCP support marked as preview.
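"OpenAI-compatible" in practice means every backend accepts the same chat-completions request shape, so a proxy only has to swap the base URL to move between a cloud provider and a local Ollama instance. A minimal sketch of that idea (the endpoint table, model name, and helper function are illustrative assumptions, not Thunderbolt's actual code):

```python
import json

# Hypothetical endpoint table: cloud providers and a local Ollama server
# all speak the same chat-completions wire format, so only the base URL
# changes. Values here are illustrative, not Thunderbolt's configuration.
ENDPOINTS = {
    "openai": "https://api.openai.com/v1",
    "mistral": "https://api.mistral.ai/v1",
    "ollama": "http://localhost:11434/v1",  # local inference, same format
}

def build_chat_request(provider: str, model: str, user_message: str):
    """Return (url, body) for a chat-completions call against any
    OpenAI-compatible backend listed in ENDPOINTS."""
    url = f"{ENDPOINTS[provider]}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }).encode("utf-8")
    return url, body

url, body = build_chat_request("ollama", "llama3", "Summarize this thread.")
```

Because the request shape never changes, swapping a frontier model for a self-hosted one is a configuration change, not a code change.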
That split matters. Mozilla is treating the interface, the orchestration layer, and the model as three pieces a buyer can swap in and out. Mix a Haystack-managed RAG pipeline with a local Llama deployment, drop a Thunderbolt workspace on top, and park the data in an offline SQLite file Mozilla calls a local "source of truth." No cloud round-trip. Sipes told The Register the whole thing can run on a single box when compliance gets loud enough.
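A local SQLite "source of truth" implies all workspace state sits in one file on disk, queryable offline with nothing leaving the machine. A minimal sketch of what that could look like (the schema, table, and column names are hypothetical, not Thunderbolt's actual format):

```python
import sqlite3

# Hypothetical sketch of an offline "source of truth": workspace state in
# a single SQLite database. Schema is illustrative only.
conn = sqlite3.connect(":memory:")  # in practice a file, e.g. a workspace db
conn.execute("""
    CREATE TABLE IF NOT EXISTS messages (
        id INTEGER PRIMARY KEY,
        conversation TEXT NOT NULL,
        role TEXT NOT NULL,          -- 'user' or 'assistant'
        content TEXT NOT NULL,
        created_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")
conn.execute(
    "INSERT INTO messages (conversation, role, content) VALUES (?, ?, ?)",
    ("audit-q3", "user", "Summarize the vendor contracts."),
)
conn.commit()

# Reading history back is a plain local query; no cloud round-trip.
rows = conn.execute(
    "SELECT role, content FROM messages WHERE conversation = ?",
    ("audit-q3",),
).fetchall()
```

The compliance appeal is that backup, retention, and deletion all reduce to handling one file with existing tooling.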
The native apps cover the full surface: Windows, macOS, Linux, iOS, and Android, plus the web. The license is MPL 2.0.
Why Berlin matters
The Haystack partnership is the load-bearing technical choice here. deepset, the Berlin shop behind Haystack, already sells into government agencies and aerospace manufacturers. It also sells into multinationals whose legal teams have spent two years refusing to sign off on a ChatGPT Enterprise rollout. Haystack hands Thunderbolt a production-grade RAG backbone on day one. Not a stub to be built later.
It is also a political choice. Haystack is one of the tools Germany placed in its sovereign D-Stack, the national software list Berlin is standardizing on for public administration. Mozilla landing on a German vendor for its sovereignty pitch is not accidental. Public sector buyers in the EU have been the loudest voice on data residency, and they are exactly the procurement cycle MZLA wants to catch.
"Organizations are looking for a complete sovereign AI stack, paired with the expertise to deliver it," Milos Rusic, deepset's co-founder and CEO, said in the joint announcement. deepset embeds forward-deployed engineers inside client environments. That is the enterprise services revenue Mozilla is targeting.
Not production-ready, and Mozilla says so
The caveats are unusually plain for a launch day. The GitHub README states Thunderbolt is "under active development, currently undergoing a security audit, and preparing for enterprise production readiness." Gigazine reported that the app still relies on external authentication and search services rather than running fully offline, which is the stated long-term goal. Telemetry is on by default with an opt-out.
Then there is the name. Intel owns the Thunderbolt trademark, Apple markets it heavily, and the Linux press spent launch day flagging the same confusion. Is this a rename of Thunderbird? MZLA has not addressed the collision. OMG Ubuntu pointed readers toward the wait list "before its inevitable name change." A security audit is the easy problem to solve here.
The Firefox-versus-Internet-Explorer pitch
Sipes is leaning on a very specific piece of Mozilla mythology. "Think about Internet Explorer's 95% market share before Firefox came onto the market," he told The Register. "We, collectively, beyond just Mozilla, have to create alternatives to Copilot and ChatGPT so that the future of AI isn't just us renting it from a few gigantic companies."
It is the pitch Mozilla needs to be making. The organization has looked institutionally adrift on AI for three years. Firefox shipped an AI Controls panel in version 148 so users could switch off features Mozilla had just added. The foundation announced in late 2025 that it wanted to "do for AI what we did for the web" without a concrete product to point at. Thunderbolt is the first thing that looks like one.
Whether it matters depends on two variables MZLA does not control. First, the security audit. Regulated buyers will not sign until it closes. Second, whether MZLA can staff the forward-deployed engineering operation its own announcement describes. Haystack handles the retrieval layer. Someone still has to sit inside a bank's network and wire it up.
For now, Thunderbolt is on GitHub, a waitlist is taking enterprise signups at thunderbolt.io, and MZLA is selling professional services to anyone willing to deploy a pre-audit client. You can own the stack today. You cannot quite trust it yet.
Frequently Asked Questions
What is Mozilla Thunderbolt?
Thunderbolt is an open-source AI client built by MZLA Technologies, the Mozilla subsidiary behind Thunderbird. It is a front-end workspace for enterprises that want to host AI infrastructure on their own servers instead of sending data to OpenAI, Microsoft, or Anthropic. The client handles chat, search, and research while the model and orchestration layer stay under the customer's control.
How does Thunderbolt work with deepset's Haystack?
Thunderbolt integrates natively with Haystack, the open-source AI framework from Berlin-based deepset. Haystack handles retrieval-augmented generation, agent orchestration, and vector storage. Thunderbolt provides the user-facing workspace on top. Together they form a sovereign stack where the client, orchestration, and model layers can all run inside an organization's own infrastructure.
Which AI models does Thunderbolt support?
LLM calls pass through a backend proxy that supports Anthropic, OpenAI, Mistral, OpenRouter, and any OpenAI-compatible endpoint. Local inference runs through Ollama and llama.cpp. The client is model-agnostic by design, which lets enterprises mix commercial frontier models with self-hosted open-source models depending on sensitivity and cost.
Is Thunderbolt ready for production use?
Not yet. The GitHub repository states the project is under active development, currently undergoing a security audit, and preparing for enterprise production readiness. Research Mode, Tasks, and full MCP support are marked as preview. Regulated buyers in finance, healthcare, and government should wait for the audit results before committing to a deployment.
What does Thunderbolt cost?
The source code is free under MPL 2.0 and enterprises can self-host it at no charge. MZLA plans to monetize through enterprise services, including paid deployment support, custom integrations, and a managed hosted version currently in development for smaller teams. Pricing reflects support level, customization, and deployment complexity.