Cipher runs powerful language models on your iPhone — or connects to models on your own hardware. No subscriptions. No cloud. No compromises.
Run quantized GGUF models directly on your iPhone. Your conversations never touch a server. No account required, no usage tracking, no API keys.
When you need more power, connect Cipher to models running on your own Mac or server via LMBridge. Get frontier-scale inference without paying per token.
Give your AI real capabilities with tool calling. Built-in tools run on-device, while MCP servers unlock terminal access, databases, Docker, web browsing, and more.
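The tool-calling flow can be pictured as a small dispatch loop: the model emits a structured call, and the app routes it to a registered handler. A minimal sketch, assuming hypothetical names throughout (this is not Cipher's actual tool API):

```python
# Toy tool-calling dispatcher. All names ("get_time", the JSON shape)
# are illustrative assumptions, not Cipher's real interface.
import json

TOOLS = {}

def tool(name):
    """Register a function as a callable tool."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("get_time")
def get_time(timezone: str) -> str:
    # Stubbed: a real tool would consult the system clock.
    return f"12:00 in {timezone}"

def dispatch(model_output: str) -> str:
    """Parse a model-emitted tool call and run the matching handler."""
    call = json.loads(model_output)
    handler = TOOLS[call["tool"]]
    return handler(**call["arguments"])

result = dispatch('{"tool": "get_time", "arguments": {"timezone": "UTC"}}')
print(result)  # 12:00 in UTC
```

The same registry pattern extends naturally to external MCP servers: each server simply contributes more entries to the tool table.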
Displays reasoning traces from DeepSeek R1 and similar models. Watch the model think before it answers.
Real-time token-by-token output. No waiting for the full response — start reading as the model writes.
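The difference from waiting on a full response is just where the render happens: each token is shown the moment it is produced. A minimal sketch with a stand-in generator (the token source here is hypothetical, not a real model):

```python
def generate_tokens(prompt: str):
    """Stand-in for model inference: yields one token at a time."""
    for token in ["Streaming", " lets", " you", " read", " early."]:
        yield token

def stream(prompt: str) -> str:
    """Render each token immediately instead of buffering the full reply."""
    out = []
    for token in generate_tokens(prompt):
        print(token, end="", flush=True)  # visible as soon as it arrives
        out.append(token)
    return "".join(out)

stream("hello")  # prints the sentence token by token
```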
Background the app, come back later — Cipher reconnects to your LMBridge server automatically.
Tune temperature, token limits, system prompts, and context window per conversation.
On-device models work with no internet connection. LMBridge adds remote power when you want it.
No usage analytics, no conversation logging, no accounts. What you ask stays with you.
The Cipher MCP catalog gives your models the ability to act — not just respond. Add any server in seconds from the built-in store.
LMBridge is the relay that connects Cipher on your phone to large models running on your own Mac or server — securely, over WebSocket.
Pair with a 6-digit PIN. Model weights stay on your machine. The relay only passes tokens — nothing is stored.
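The relay's two guarantees — pair with a short PIN, forward tokens without retaining them — can be modeled in a few lines. A toy in-memory sketch, not LMBridge's actual implementation (the real relay speaks WebSocket over the network):

```python
import secrets
from collections import deque

class Relay:
    """Toy token relay: a phone pairs via a 6-digit PIN, then tokens
    pass through an in-flight queue and are dropped once delivered."""
    def __init__(self):
        self.pin = f"{secrets.randbelow(1_000_000):06d}"  # shown on the host
        self.paired = False
        self.queue = deque()  # in-flight tokens only; nothing is persisted

    def pair(self, pin: str) -> bool:
        # Constant-time comparison to avoid leaking the PIN via timing.
        self.paired = secrets.compare_digest(pin, self.pin)
        return self.paired

    def push(self, token: str) -> None:
        if not self.paired:
            raise PermissionError("not paired")
        self.queue.append(token)

    def pull(self) -> str:
        return self.queue.popleft()  # the token leaves the relay once read

relay = Relay()
assert relay.pair(relay.pin)
relay.push("Hello")
print(relay.pull())  # Hello
```

Because the queue holds only tokens currently in transit, there is nothing left to log after delivery — which is the property the relay design is built around.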
Download Cipher and start running AI on your own terms today.