OpenCode logo

OpenCode — Offline-First AI Coding Agent

Your privacy-first AI pair programmer — runs entirely offline with Ollama

Discord Stars Forks License npm Build status Docs


Why OpenCode?

OpenCode is an AI-powered coding agent built for developers who value privacy, control, and offline capability. This fork specifically prioritizes 100% offline operation via Ollama — no cloud dependencies, no API keys, no data leaving your machine.

  • Fully Offline — Run code generation, analysis, and refactoring entirely on your local machine with Ollama
  • Open Source — MIT licensed, transparent, community-driven
  • Multi-Provider — Use Ollama, Claude, OpenAI, Google, or any LLM provider
  • Terminal-First — Built by terminal enthusiasts for maximum productivity
  • Desktop Beta — Native apps for macOS, Windows, and Linux

This fork was rebuilt and enhanced for robust offline-first usage. Perfect for air-gapped environments, enterprise security requirements, or developers who prefer complete data sovereignty.


Quick Install

# One-liner (recommended)
curl -fsSL https://opencode.ai/install | bash

# npm / bun / pnpm / yarn
npm i -g opencode-ai@latest
# or: bun i -g opencode-ai@latest
# or: pnpm i -g opencode-ai@latest
# or: yarn global add opencode-ai@latest

# Homebrew (macOS/Linux)
brew install anomalyco/tap/opencode

# Scoop (Windows)
scoop install opencode

# Choco (Windows)
choco install opencode

# Nix
nix run nixpkgs#opencode

# mise
mise use -g opencode

# Arch Linux
sudo pacman -S opencode
paru -S opencode-bin

Tip

Remove versions older than 0.1.x before installing.


Setting Up Offline Ollama

This fork is optimized for fully offline operation. Follow these steps to get started:

1. Install Ollama

# macOS / Linux
curl -fsSL https://ollama.com/install.sh | sh

# Windows (via winget)
winget install Ollama.Ollama

# Or download directly from https://ollama.com/download

2. Pull a Coding Model

# Recommended models for coding
ollama pull codellama         # General coding, fast
ollama pull deepseek-coder    # Excellent for code generation
ollama pull qwen2.5-coder     # Good balance of speed/quality
ollama pull llama3.1          # General purpose, larger
ollama pull mistral           # Fast, good for simple tasks

3. Configure OpenCode to Use Ollama

# Option 1: Environment variables
export OPENCODE_PROVIDER=ollama
export OPENCODE_MODEL=codellama
export OLLAMA_HOST=localhost:11434

# Option 2: Config file (~/.opencode/config.json)
cat > ~/.opencode/config.json << 'EOF'
{
  "provider": "ollama",
  "model": "codellama"
}
EOF

4. Verify Offline Mode

opencode --version
opencode . --provider ollama "Write a hello world in Rust"
# This runs entirely locally — no network required!
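Before relying on offline mode, it can help to confirm that the local Ollama daemon is actually reachable. A minimal sketch, assuming the default `localhost:11434` address and Ollama's `/api/tags` endpoint (which lists locally pulled models):

```shell
# Fall back to Ollama's default address when OLLAMA_HOST is unset
OLLAMA_HOST="${OLLAMA_HOST:-localhost:11434}"
echo "Using Ollama at $OLLAMA_HOST"

# /api/tags lists locally pulled models; a failure means the daemon isn't running
if curl -fsS "http://$OLLAMA_HOST/api/tags" >/dev/null 2>&1; then
  echo "Ollama is reachable"
else
  echo "Ollama is not running; start it with 'ollama serve'"
fi
```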

Usage Examples

Basic Build Agent (Full Access)

# Start a coding session in your project
opencode .

# Or run a specific task
opencode . "Add user authentication to the login page"

The build agent has full access to:

  • Read/write any file
  • Run shell commands
  • Execute git operations
  • Use LSP for code intelligence

Plan Mode (Read-Only Analysis)

# Switch to plan agent for safe exploration
opencode . --agent plan

The plan agent is ideal for:

  • Exploring unfamiliar codebases
  • Planning refactoring work
  • Security audits (no file modifications)
  • Code reviews

General Agent (Research & Multistep Tasks)

# Use @general for complex research tasks
@general Research best practices for implementing WebSocket in Node.js
@general Find all TODO comments and categorize by priority

Switching Between Providers

# Use Ollama (offline)
opencode . --provider ollama

# Use Claude (cloud)
opencode . --provider claude

# Use OpenAI (cloud)
opencode . --provider openai

Features

| Feature | Description |
| --- | --- |
| Offline-First | 100% local operation via Ollama — no internet required |
| Multi-Provider | Ollama, Claude, OpenAI, Google Gemini, and more |
| Smart Agents | build (full access), plan (read-only), general (research) |
| TUI-First Design | Terminal interface built for speed and keyboard efficiency |
| LSP Support | Out-of-the-box Language Server Protocol integration |
| VS Code Extension | VS Code SDK integration for seamless editing |
| Desktop Beta | Native apps for macOS, Windows, Linux |
| MCP Support | Model Context Protocol for extensible integrations |
| Client/Server | Headless server mode for remote control or UI customization |

Comparison with Alternatives

| Feature | OpenCode (This Fork) | Original OpenCode | Claude Code | Aider | Continue.dev |
| --- | --- | --- | --- | --- | --- |
| Open Source | ✅ MIT | ✅ MIT | ❌ Closed | ✅ MIT | ✅ MIT |
| Fully Offline | ✅ Ollama | ⚠️ Limited | ❌ Cloud only | ✅ Local | ⚠️ Limited |
| TUI Interface | | | | | |
| Multiple Agents | | | | | |
| Desktop App | ✅ Beta | ✅ Beta | | | |
| LSP Built-in | | | | | |
| VS Code Extension | | | | | |
| Fork for Offline | ✅ Primary | | | | |

Architecture Overview

┌─────────────────────────────────────────────────────────────┐
│                          OpenCode                           │
├─────────────────────────────────────────────────────────────┤
│  ┌──────────────┐  ┌──────────────┐  ┌──────────────┐       │
│  │  TUI Client  │  │  Web Client  │  │ Desktop App  │       │
│  │  (Terminal)  │  │  (Browser)   │  │   (Tauri)    │       │
│  └──────┬───────┘  └──────┬───────┘  └──────┬───────┘       │
│         │                 │                 │               │
│         └─────────────────┼─────────────────┘               │
│                           │                                 │
│                 ┌─────────▼────────┐                        │
│                 │    API Server    │                        │
│                 │ (Bun/TypeScript) │                        │
│                 └─────────┬────────┘                        │
│                           │                                 │
│          ┌────────────────┼───────────────────┐             │
│  ┌───────▼───────┐ ┌──────▼────────┐  ┌───────▼───────┐     │
│  │    Ollama     │ │ Claude/OpenAI │  │  LSP Server   │     │
│  │   (Offline)   │ │    (Cloud)    │  │               │     │
│  └───────────────┘ └───────────────┘  └───────────────┘     │
└─────────────────────────────────────────────────────────────┘
  • Client Layer: TUI (terminal), Web UI, Desktop App
  • Server Layer: Bun-powered API server handling agent orchestration
  • Provider Layer: Pluggable LLM providers (Ollama, Claude, OpenAI, etc.)
  • Tool Layer: File operations, shell execution, LSP integration, git operations

Desktop App (Beta)

OpenCode is also available as a desktop application. Download directly from the releases page or opencode.ai/download.

| Platform | Download |
| --- | --- |
| macOS (Apple Silicon) | opencode-desktop-darwin-aarch64.dmg |
| macOS (Intel) | opencode-desktop-darwin-x64.dmg |
| Windows | opencode-desktop-windows-x64.exe |
| Linux | .deb, .rpm, or AppImage |

# macOS (Homebrew)
brew install --cask opencode-desktop

# Windows (Scoop)
scoop bucket add extras; scoop install extras/opencode-desktop

Installation Directory

The install script respects the following priority order for the installation path:

  1. $OPENCODE_INSTALL_DIR - Custom installation directory
  2. $XDG_BIN_DIR - XDG Base Directory Specification compliant path
  3. $HOME/bin - Standard user binary directory (if it exists or can be created)
  4. $HOME/.opencode/bin - Default fallback
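The priority order above can be sketched as a small shell function. This is a hypothetical re-implementation for illustration (`resolve_install_dir` is not part of the actual install script, and step 3 is simplified to an existence check rather than "exists or can be created"):

```shell
# Hypothetical sketch of the documented install-path priority order
resolve_install_dir() {
  if [ -n "$OPENCODE_INSTALL_DIR" ]; then
    echo "$OPENCODE_INSTALL_DIR"   # 1. explicit override
  elif [ -n "$XDG_BIN_DIR" ]; then
    echo "$XDG_BIN_DIR"            # 2. XDG-compliant path
  elif [ -d "$HOME/bin" ]; then
    echo "$HOME/bin"               # 3. existing ~/bin
  else
    echo "$HOME/.opencode/bin"     # 4. default fallback
  fi
}

resolve_install_dir
```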

Agents

OpenCode includes two built-in agents you can switch between with the Tab key.

  • build - Default, full-access agent for development work
  • plan - Read-only agent for analysis and code exploration
    • Denies file edits by default
    • Asks permission before running bash commands
    • Ideal for exploring unfamiliar codebases or planning changes

Also included is a general subagent for complex searches and multistep tasks. This is used internally and can be invoked using @general in messages.

Learn more about agents.


Documentation

For more info on how to configure OpenCode, head over to our docs.


Contributing

If you're interested in contributing to OpenCode, please read our contributing docs before submitting a pull request.

Development Setup

# Requirements: Bun 1.3+
git clone https://github.com/koryboyd/opencode.git
cd opencode
bun install
bun dev                    # Start development server

Looking for ways to contribute?


Building on OpenCode

If you are working on a project related to OpenCode that uses "opencode" as part of its name (for example, "opencode-dashboard" or "opencode-mobile"), please add a note to your README clarifying that it is not built by the OpenCode team and is not affiliated with us in any way.


FAQ

How is this different from Claude Code?

It's very similar to Claude Code in terms of capability. Here are the key differences:

  • 100% open source
  • Not coupled to any provider. Although we recommend the models we provide through OpenCode Zen, OpenCode can be used with Claude, OpenAI, Google, or even local models. As models evolve, the gaps between them will close and pricing will drop, so being provider-agnostic is important.
  • Out-of-the-box LSP support
  • A focus on TUI. OpenCode is built by neovim users and the creators of terminal.shop; we are going to push the limits of what's possible in the terminal.
  • A client/server architecture. This, for example, can allow OpenCode to run on your computer while you drive it remotely from a mobile app, meaning that the TUI frontend is just one of the possible clients.
  • This fork specifically prioritizes offline-first operation via Ollama — perfect for air-gapped environments, enterprise security, or complete data sovereignty.

Roadmap

In Progress

  • Desktop app stability improvements
  • Enhanced Ollama model optimization
  • Offline-first documentation expansion

Planned

  • More local model provider support (llama.cpp, text-generation-webui)
  • Improved agent memory/context management
  • Additional LSP integrations

Community & Links

Join our community Discord | X.com


Credits & License

  • Original Project: anomalyco/opencode — Thanks to the core team for building this amazing tool
  • This Fork: koryboyd/opencode — Focused on offline-first, Ollama-powered AI coding
  • License: MIT — Free for personal and commercial use

Built with ❤️ for developers who value privacy and control


