# OpenCode — Offline-First AI Coding Agent

> Your privacy-first AI pair programmer — runs entirely offline with Ollama
OpenCode is an AI-powered coding agent built for developers who value privacy, control, and offline capability. This fork specifically prioritizes 100% offline operation via Ollama — no cloud dependencies, no API keys, no data leaving your machine.
- Fully Offline — Run code generation, analysis, and refactoring entirely on your local machine with Ollama
- Open Source — MIT licensed, transparent, community-driven
- Multi-Provider — Use Ollama, Claude, OpenAI, Google, or any LLM provider
- Terminal-First — Built by terminal enthusiasts for maximum productivity
- Desktop Beta — Native apps for macOS, Windows, and Linux
This fork was rebuilt and enhanced for robust offline-first usage. Perfect for air-gapped environments, enterprise security requirements, or developers who prefer complete data sovereignty.
```bash
# One-liner (recommended)
curl -fsSL https://opencode.ai/install | bash

# npm / bun / pnpm / yarn
npm i -g opencode-ai@latest
# or: bun i -g opencode-ai@latest
# or: pnpm i -g opencode-ai@latest
# or: yarn global add opencode-ai@latest

# Homebrew (macOS/Linux)
brew install anomalyco/tap/opencode

# Scoop (Windows)
scoop install opencode

# Choco (Windows)
choco install opencode

# Nix
nix run nixpkgs#opencode

# mise
mise use -g opencode

# Arch Linux
sudo pacman -S opencode
paru -S opencode-bin
```

> Tip: Remove versions older than 0.1.x before installing.
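If you are upgrading, a quick sanity check in the spirit of the tip above (the version parsing here is illustrative, not an official script):

```shell
# Flag a pre-0.1.x install before upgrading (illustrative check only)
v="$(opencode --version 2>/dev/null || echo 0.0.0)"
case "$v" in
  0.0.*) echo "old version ($v): remove it before installing" ;;
  *)     echo "version $v is recent enough" ;;
esac
```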
This fork is optimized for fully offline operation. Follow these steps to get started:
```bash
# macOS / Linux
curl -fsSL https://ollama.com/install.sh | sh

# Windows (via winget)
winget install Ollama.Ollama

# Or download directly from https://ollama.com/download
```

```bash
# Recommended models for coding
ollama pull codellama      # General coding, fast
ollama pull deepseek-coder # Excellent for code generation
ollama pull qwen2.5-coder  # Good balance of speed/quality
ollama pull llama3.1       # General purpose, larger
ollama pull mistral        # Fast, good for simple tasks
```

```bash
# Option 1: Environment variables
export OPENCODE_PROVIDER=ollama
export OPENCODE_MODEL=codellama
export OLLAMA_HOST=localhost:11434
```
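If you point OpenCode at a non-default Ollama host, it helps to normalize `OLLAMA_HOST` into a full URL before probing the server. A small helper (the function name is mine, not part of OpenCode or Ollama) that applies the same `localhost:11434` default as above:

```shell
# Normalize $OLLAMA_HOST into a base URL, defaulting to localhost:11434
ollama_base_url() {
  host="${OLLAMA_HOST:-localhost:11434}"
  case "$host" in
    http://*|https://*) echo "$host" ;;
    *)                  echo "http://$host" ;;
  esac
}

# Ollama's /api/tags endpoint lists pulled models, e.g.:
#   curl -s "$(ollama_base_url)/api/tags"
```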
```bash
# Option 2: Config file (~/.opencode/config.json)
cat > ~/.opencode/config.json << 'EOF'
{
  "provider": "ollama",
  "model": "codellama"
}
EOF
```
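A more defensive variant of the config step writes to a temporary file and validates the JSON before installing it, so a typo never clobbers a working config (same path and keys as the minimal example above):

```shell
# Validate the JSON before replacing ~/.opencode/config.json
CONFIG="${HOME}/.opencode/config.json"
TMP="$(mktemp)"
cat > "$TMP" << 'EOF'
{
  "provider": "ollama",
  "model": "codellama"
}
EOF
if python3 -m json.tool "$TMP" > /dev/null 2>&1; then
  mkdir -p "$(dirname "$CONFIG")"
  mv "$TMP" "$CONFIG"
  echo "config installed at $CONFIG"
else
  echo "invalid JSON; existing config left untouched" >&2
  rm -f "$TMP"
fi
```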
```bash
opencode --version
opencode . --provider ollama "Write a hello world in Rust"
# This runs entirely locally — no network required!
```

```bash
# Start a coding session in your project
opencode .

# Or run a specific task
opencode . "Add user authentication to the login page"
```

The build agent has full access to:
- Read/write any file
- Run shell commands
- Execute git operations
- Use LSP for code intelligence
```bash
# Switch to plan agent for safe exploration
opencode . --agent plan
```

The plan agent is ideal for:
- Exploring unfamiliar codebases
- Planning refactoring work
- Security audits (no file modifications)
- Code reviews
```
# Use @general for complex research tasks
@general Research best practices for implementing WebSocket in Node.js
@general Find all TODO comments and categorize by priority
```

```bash
# Use Ollama (offline)
opencode . --provider ollama

# Use Claude (cloud)
opencode . --provider claude

# Use OpenAI (cloud)
opencode . --provider openai
```

| Feature | Description |
|---|---|
| Offline-First | 100% local operation via Ollama — no internet required |
| Multi-Provider | Ollama, Claude, OpenAI, Google Gemini, and more |
| Smart Agents | build (full access), plan (read-only), general (research) |
| TUI-First Design | Terminal interface built for speed and keyboard efficiency |
| LSP Support | Out-of-the-box Language Server Protocol integration |
| VS Code Extension | VS Code SDK integration for seamless editing |
| Desktop Beta | Native apps for macOS, Windows, Linux |
| MCP Support | Model Context Protocol for extensible integrations |
| Client/Server | Headless server mode for remote or UI customization |
| Feature | OpenCode (This Fork) | Original OpenCode | Claude Code | Aider | Continue.dev |
|---|---|---|---|---|---|
| Open Source | ✅ MIT | ✅ MIT | ❌ Closed | ✅ MIT | ✅ MIT |
| Fully Offline | ✅ Ollama | ❌ Cloud only | ✅ Local | | |
| TUI Interface | ✅ | ✅ | ✅ | ✅ | ❌ |
| Multiple Agents | ✅ | ✅ | ✅ | ❌ | ❌ |
| Desktop App | ✅ Beta | ✅ Beta | ✅ | ❌ | ❌ |
| LSP Built-in | ✅ | ✅ | ✅ | ✅ | |
| VS Code Extension | ✅ | ✅ | ✅ | ❌ | ✅ |
| Fork for Offline | ✅ Primary | ❌ | ❌ | ❌ | ❌ |
```
┌─────────────────────────────────────────────────────────────┐
│                          OpenCode                           │
├─────────────────────────────────────────────────────────────┤
│  ┌──────────────┐    ┌──────────────┐    ┌──────────────┐   │
│  │  TUI Client  │    │  Web Client  │    │ Desktop App  │   │
│  │  (Terminal)  │    │  (Browser)   │    │   (Tauri)    │   │
│  └──────┬───────┘    └──────┬───────┘    └──────┬───────┘   │
│         │                   │                   │           │
│         └───────────────────┼───────────────────┘           │
│                             │                               │
│                    ┌────────▼────────┐                      │
│                    │   API Server    │                      │
│                    │(Bun/TypeScript) │                      │
│                    └────────┬────────┘                      │
│                             │                               │
│         ┌───────────────────┼───────────────────┐           │
│         │                   │                   │           │
│  ┌──────▼──────┐   ┌────────▼────────┐   ┌──────▼──────┐    │
│  │   Ollama    │   │  Claude/OpenAI  │   │     LSP     │    │
│  │  (Offline)  │   │     (Cloud)     │   │   Server    │    │
│  └─────────────┘   └─────────────────┘   └─────────────┘    │
└─────────────────────────────────────────────────────────────┘
```
- Client Layer: TUI (terminal), Web UI, Desktop App
- Server Layer: Bun-powered API server handling agent orchestration
- Provider Layer: Pluggable LLM providers (Ollama, Claude, OpenAI, etc.)
- Tool Layer: File operations, shell execution, LSP integration, git operations
OpenCode is also available as a desktop application. Download directly from the releases page or opencode.ai/download.
| Platform | Download |
|---|---|
| macOS (Apple Silicon) | opencode-desktop-darwin-aarch64.dmg |
| macOS (Intel) | opencode-desktop-darwin-x64.dmg |
| Windows | opencode-desktop-windows-x64.exe |
| Linux | .deb, .rpm, or AppImage |
```bash
# macOS (Homebrew)
brew install --cask opencode-desktop

# Windows (Scoop)
scoop bucket add extras
scoop install extras/opencode-desktop
```

The install script respects the following priority order for the installation path:

1. `$OPENCODE_INSTALL_DIR` - Custom installation directory
2. `$XDG_BIN_DIR` - XDG Base Directory Specification compliant path
3. `$HOME/bin` - Standard user binary directory (if it exists or can be created)
4. `$HOME/.opencode/bin` - Default fallback
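The priority order above can be sketched as a small resolver (an approximation of the documented behavior, not the actual install script; the "can be created" case for `$HOME/bin` is simplified to an existence check):

```shell
# Resolve the install directory using the documented priority order
resolve_install_dir() {
  if [ -n "$OPENCODE_INSTALL_DIR" ]; then
    echo "$OPENCODE_INSTALL_DIR"   # 1. explicit override
  elif [ -n "$XDG_BIN_DIR" ]; then
    echo "$XDG_BIN_DIR"            # 2. XDG-compliant path
  elif [ -d "$HOME/bin" ]; then
    echo "$HOME/bin"               # 3. standard user bin dir
  else
    echo "$HOME/.opencode/bin"     # 4. default fallback
  fi
}
```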
OpenCode includes two built-in agents you can switch between with the Tab key.
- `build` - Default, full-access agent for development work
- `plan` - Read-only agent for analysis and code exploration
  - Denies file edits by default
  - Asks permission before running bash commands
  - Ideal for exploring unfamiliar codebases or planning changes
Also included is a general subagent for complex searches and multistep tasks.
This is used internally and can be invoked using @general in messages.
Learn more about agents.
For more info on how to configure OpenCode, head over to our docs.
If you're interested in contributing to OpenCode, please read our contributing docs before submitting a pull request.
```bash
# Requirements: Bun 1.3+
git clone https://github.com/koryboyd/opencode.git
cd opencode
bun install
bun dev  # Start development server
```

Looking for ways to contribute?
If you are working on a project that's related to OpenCode and is using "opencode" as part of its name, for example "opencode-dashboard" or "opencode-mobile", please add a note to your README to clarify that it is not built by the OpenCode team and is not affiliated with us in any way.
It's very similar to Claude Code in terms of capability. Here are the key differences:
- 100% open source
- Not coupled to any provider. Although we recommend the models we provide through OpenCode Zen, OpenCode can be used with Claude, OpenAI, Google, or even local models. As models evolve, the gaps between them will close and pricing will drop, so being provider-agnostic is important.
- Out-of-the-box LSP support
- A focus on TUI. OpenCode is built by neovim users and the creators of terminal.shop; we are going to push the limits of what's possible in the terminal.
- A client/server architecture. This, for example, can allow OpenCode to run on your computer while you drive it remotely from a mobile app, meaning that the TUI frontend is just one of the possible clients.
- This fork specifically prioritizes offline-first operation via Ollama — perfect for air-gapped environments, enterprise security, or complete data sovereignty.
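Being provider-agnostic also makes fallback easy to script. A sketch (the wrapper and host probe are mine, not part of OpenCode; `getent` is Linux-specific) that prefers a cloud provider when its API host resolves and otherwise drops back to local Ollama:

```shell
# Prefer a cloud provider when its API host resolves; else use Ollama
pick_provider() {
  preferred="$1"  # e.g. claude
  api_host="$2"   # e.g. api.anthropic.com
  if getent hosts "$api_host" > /dev/null 2>&1; then
    echo "$preferred"
  else
    echo "ollama"
  fi
}

# Usage: opencode . --provider "$(pick_provider claude api.anthropic.com)"
```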
- Desktop app stability improvements
- Enhanced Ollama model optimization
- Offline-first documentation expansion
- More local model provider support (llama.cpp, text-generation-webui)
- Improved agent memory/context management
- Additional LSP integrations
- Discord: Join our community
- X/Twitter: @opencode
- Issues: GitHub Issues
- Original Project: anomalyco/opencode — Thanks to the core team for building this amazing tool
- This Fork: koryboyd/opencode — Focused on offline-first, Ollama-powered AI coding
- License: MIT — Free for personal and commercial use
Built with ❤️ for developers who value privacy and control
English | 简体中文 | 繁體中文 | 한국어 | Deutsch | Español | Français | Italiano | Dansk | 日本語 | Polski | Русский | Bosanski | العربية | Norsk | Português (Brasil) | ไทย | Türkçe | Українська | বাংলা | Ελληνικά | Tiếng Việt