LobeChat – Modern Open-Source AI Chat Framework
What is LobeChat?
LobeChat is an open-source, multi-model AI chat framework (WebUI) that lets you deploy your own personal ChatGPT-like app. It supports major language models such as GPT‑4, Claude 3, Gemini, Mistral, LLaMA, Ollama, DeepSeek, and more.
Why Choose LobeChat?
- Multi-AI platform: Access a variety of AI models (OpenAI, Anthropic, Google Gemini, domestic providers, Ollama, etc.) from a unified interface.
- Free, open-source & self-hosted: Released under the Apache 2.0 license, LobeChat can be deployed yourself via Docker or Vercel, with no fees or mandatory login.
- Rich multi-modal support: Includes speech-to-text, text-to-speech, image recognition (Gemini Vision, GPT‑4 Vision), file upload, whiteboard, text-to-image, and even video generation via plugins.
- Plugin & agent ecosystem: Offers over 40 plugins (web search, stock analysis, docs) and 500+ agents tailored to domains such as coding, research, and creative writing.
- Premium UI & conversation features: Beautiful UI with light/dark modes, PWA support, Markdown with code blocks/LaTeX/Mermaid, visual “chain-of-thought” tracing, and branching conversations.
Core Features
- Multi-cloud LLMs: One-click integration with GPT‑3.5/4, Claude 3, Gemini Pro‑Vision, Mistral, DeepSeek, and more.
- Vision & voice support: Upload images for visual Q&A, or use TTS/STT to chat by voice.
- Artifacts & whiteboarding: Create interactive diagrams, dynamic HTML, and documents directly inside the chat.
- Knowledge base support: Upload files/audio/video, build your own knowledge store, and use retrieval-augmented generation.
- Branching conversations: Fork dialogue threads from any message to explore side paths without losing context.
How to Install and Use LobeChat
1. Install via Docker (recommended)
bash <(curl -fsSL https://lobe.li/setup.sh) -l zh_CN
docker compose up -d
This script sets up LobeChat with Docker; the -l zh_CN flag selects the Chinese locale (omit it for English), and docker compose up -d then starts the services in the background.
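If you prefer a single-container trial without the setup script, the project also publishes a Docker image. A minimal sketch (image name lobehub/lobe-chat and default port 3210 per the official README; the API key is a placeholder):

```shell
# Pull and run the LobeChat image, exposing the default port 3210
docker run -d --name lobe-chat \
  -p 3210:3210 \
  -e OPENAI_API_KEY=sk-your-key-here \
  lobehub/lobe-chat
```

This runs the stateless client-side variant; for persistent server-side storage, use the Compose-based setup instead.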
2. Or deploy manually
- Clone the repo:
git clone https://github.com/lobehub/lobe-chat.git
- Use Docker Compose or Vercel to deploy.
- Supports custom domains, a local SQLite database, or a remote PostgreSQL/MySQL database.
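For the Docker Compose route, a minimal hypothetical docker-compose.yml might look like the sketch below; the image name follows the official registry, but check the repo's own compose files for the current, supported configuration (all values here are placeholders):

```yaml
# Minimal sketch of a LobeChat Compose service (placeholder values)
services:
  lobe-chat:
    image: lobehub/lobe-chat
    ports:
      - "3210:3210"
    environment:
      - OPENAI_API_KEY=sk-your-key-here
      - ACCESS_CODE=your-access-password
    restart: always
```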
3. Access the app
- Visit http://localhost:3210 (the default port) or your domain.
- Register a user (authentication via next-auth or Clerk is supported).
- Choose your preferred AI model(s) in settings and start chatting.
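Before first access, it is common to pass credentials and an access gate through environment variables. A sketch under the assumption that the variable names below match the LobeChat docs (values are placeholders):

```shell
# Example environment for a self-hosted deployment (placeholder values;
# variable names assumed from the LobeChat documentation)
export OPENAI_API_KEY=sk-your-key-here        # LLM provider credential
export ACCESS_CODE=your-access-password       # simple password gate for the UI
export NEXT_AUTH_SECRET=$(openssl rand -base64 32)  # session secret for next-auth
```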
Usage Tips
- Switch AI providers to compare responses, privacy, or costs.
- Enable PWA mode to install LobeChat like a native app on desktop or mobile.
- Use visual tools: upload an image for analysis, then ask follow-up questions.
- Use branching to explore alternate solutions without starting over.
FAQ
Q: Is LobeChat free?
A: Yes. It’s an open-source, self-hosted platform with no usage cost. You pay only for API usage if you connect external LLM providers.
Q: Which models can I use?
A: It supports OpenAI GPT models, Claude 3, Google Gemini Pro‑Vision, Mistral, DeepSeek, Ollama, domestic Chinese models, and more, swappable on the fly.
Q: Can it run locally without internet?
A: Yes, if you use local models via Ollama (such as LLaMA). Cloud models still require an internet connection.
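As a concrete offline setup, you can point LobeChat at a locally running Ollama server. A sketch assuming Ollama's default endpoint (http://127.0.0.1:11434) and the OLLAMA_PROXY_URL variable from the LobeChat docs:

```shell
# Fetch a local model (internet needed only for this initial pull)
ollama pull llama3
# Point LobeChat at the local Ollama endpoint before starting it
export OLLAMA_PROXY_URL=http://127.0.0.1:11434
```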
Q: Is data kept private?
A: Yes. Your data stays on your own server or local browser, giving you full control, and there is no third-party tracking.