Enclave
Self-hosted / privacy

Your conversations live only on your own machine

Enclave is open-source under MIT and runs on your own server or home box with a single docker compose command. Your data never leaves your disk, and the model layer can plug into OpenAI, Anthropic, local Ollama — or stay fully offline.

The cost of handing your private conversations to the cloud

Mainstream AI chat products store your conversations in the cloud and feed them into future training. Even when the ToS says "we won't," you have no way to verify it yourself. The moment your account gets locked, the platform shuts down, or compliance rules shift, those chats are no longer in your hands.

The entire stack is open-source — running it yourself is the strongest privacy promise there is

The entire Enclave monorepo lives on GitHub under MIT: audit it, modify it, fork it. A single docker compose command spins up your own instance, with API, frontend, database, and vector index all running locally. The model layer mixes cloud APIs and local Ollama / vLLM however you want; you decide which model handles which kind of conversation.

How it works in practice

  • Fully open-source code

    MIT-licensed with no binary black boxes — every "we won't touch your data" promise is something you can audit yourself.

  • Three-minute Docker deployment

clone → cp .env → docker compose up. The README spells out the whole flow at the top; anyone can spin it up.

  • Fully swappable models

    Plug in OpenAI, Anthropic, Google, DeepSeek, or local Ollama / vLLM — different characters can even run on different models.

  • Exportable and deletable data

    Export the whole database as JSON, migrate to another machine, or wipe everything — all one-click. No "copies that can't be deleted."
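The three-minute deploy described above can be sketched as a short shell session. This is an illustrative sketch, not the literal README: the repository URL and the name of the env template file are placeholders, so substitute the actual ones from the Enclave README.

```shell
# Placeholder URL -- use the actual Enclave repository from the README.
git clone https://github.com/your-org/enclave.git
cd enclave

# Copy the environment template and fill in model API keys,
# or leave them blank for a fully local Ollama setup.
cp .env.example .env

# Start API, frontend, database, and vector index locally,
# detached so they keep running in the background.
docker compose up -d
```

After the services come up, the instance is reachable in your browser on whatever port the compose file exposes.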

FAQ

  • How hard is self-hosting?
    If you've used docker compose, it's about as hard as running any other web service. The README's "three-minute deploy" flow is real — clone, edit .env, start the services. Three steps.
  • Can it run fully offline?
Yes. Swap the model layer for local Ollama or vLLM, turn off real-world sync, and the system makes no outbound requests at all.
  • Can a self-hosted instance still get the official feature updates?
    Run git pull + docker compose up -d to roll forward to the latest version. The CHANGELOG flags breaking changes for every release.
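The update flow from the FAQ, as a sketch. It assumes you deployed from a git checkout as in the README flow; the directory name is illustrative.

```shell
cd enclave            # your checkout of the Enclave repo
git pull              # fetch the latest release

# Check the CHANGELOG for breaking-change notes before restarting;
# the project flags them for every release.

docker compose up -d  # restart services on the new version in place
```

Because the database lives in a local volume, a restart like this rolls the code forward without touching your conversation data.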

Ready to try it?

Open it in your browser — no credit card, no install.
