Welcome to Chathouse
The open-source AI chat interface. Bring your own keys, connect any model, and start chatting.
Chathouse is an open-source, self-hostable AI chat interface that connects to OpenAI, Anthropic, and Google Gemini through your own API keys. Instead of paying $20/month per provider for capped access, you pay only for what you use - typically a fraction of a cent per message.
Chathouse runs entirely on your infrastructure. Your conversations, API keys, and data never leave your server.
Why Chathouse?
Most people subscribe to ChatGPT, Claude, or Gemini separately to use their LLMs. Using all three costs at least $60/month in subscriptions for models you could query directly via API for pennies. Chathouse gives you a single, polished chat UI across all providers, running on your own hardware, with no middleman.
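To make the "pennies per message" claim concrete, here is a rough back-of-the-envelope calculation. The token prices below are illustrative placeholders, not current rates for any provider — always check each provider's pricing page:

```python
# Illustrative only: these per-token prices are assumed placeholders,
# not real rates for OpenAI, Anthropic, or Google.
PRICE_PER_1M_INPUT = 3.00    # USD per 1M input tokens (assumed)
PRICE_PER_1M_OUTPUT = 15.00  # USD per 1M output tokens (assumed)

def message_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the API cost of a single chat message in USD."""
    return (input_tokens / 1_000_000) * PRICE_PER_1M_INPUT \
         + (output_tokens / 1_000_000) * PRICE_PER_1M_OUTPUT

# A typical message: ~500 tokens of prompt/history, ~300 tokens of reply.
cost = message_cost(500, 300)
print(f"${cost:.4f} per message")  # well under a cent
```

Even at a few hundred messages a month, pay-as-you-go API usage at rates in this ballpark stays far below three $20/month subscriptions.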
- Bring your own keys — connect OpenAI, Anthropic, and Google in seconds
- Queue-based processing — messages process in the background, so responses continue even if you close the browser
- Chat sharing — share conversations via unique links
- Two-factor authentication — secure your instance with TOTP-based 2FA
- File uploads — attach files to your messages
- Customizable — system prompts, model selection, chat history retention, and more
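The queue-based processing above can be sketched conceptually: the web request only enqueues the message, and a background worker calls the model and stores the reply, so the response completes even if the browser disconnects. This toy sketch is not Chathouse's actual implementation; `call_model` is a hypothetical stand-in for a provider API call:

```python
import queue
import threading

jobs: queue.Queue = queue.Queue()
results: dict[str, str] = {}

def call_model(prompt: str) -> str:
    # Stand-in for a real provider API call (hypothetical).
    return f"reply to: {prompt}"

def worker() -> None:
    # Background worker: keeps processing even if the client goes away.
    while True:
        msg_id, prompt = jobs.get()
        results[msg_id] = call_model(prompt)
        jobs.task_done()

threading.Thread(target=worker, daemon=True).start()

# The HTTP handler just enqueues and returns immediately:
jobs.put(("msg-1", "Hello"))

jobs.join()  # in a real deployment the client would poll for the result
print(results["msg-1"])
```

The key design point is the decoupling: the request/response cycle ends as soon as the job is queued, and the client later fetches the stored result.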
Get started
Usage guide
Learn how to chat, share conversations, manage models, and more.
Self-hosting guide
Get Chathouse running on your server in under 5 minutes with Docker Compose.
Configuration reference
All environment variables and settings explained.
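For orientation, a Docker Compose deployment of this shape typically looks like the sketch below. Every specific here — the image name, port, and variable names — is an assumption for illustration, so defer to the self-hosting guide and configuration reference for the real values:

```yaml
# Hypothetical compose file: service, image, and variable names are
# placeholders, not Chathouse's published values.
services:
  chathouse:
    image: chathouse/chathouse:latest   # assumed image name
    ports:
      - "3000:3000"                     # assumed default port
    environment:
      DATABASE_URL: postgres://chathouse:chathouse@db:5432/chathouse
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: chathouse
      POSTGRES_PASSWORD: chathouse
      POSTGRES_DB: chathouse
    volumes:
      - db-data:/var/lib/postgresql/data
volumes:
  db-data:
```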
How it works
Once deployed, you register your account (registration automatically closes after the first user signs up), add your API keys in Settings > Connections, and start chatting.
Community
Join our Discord server to chat with the community, report bugs, or request features. You can also open issues on our GitHub repository.
And feel free to follow me on X for the latest updates and news about Chathouse :)