OpenCode Proxy is a small local bridge that lets Anthropic-compatible clients talk to OpenCode Go or OpenCode Zen.
It accepts Anthropic-style requests on `POST /v1/messages`, converts them to OpenAI-style chat completion requests, forwards them to OpenCode Go (or Zen), and converts the response back to the Anthropic shape expected by tools such as Claude Code and Claw.
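For example, a minimal non-streaming request against the local proxy might look like this. This is an illustrative sketch only: the body follows the standard Anthropic Messages shape, and the dummy key mirrors the `ANTHROPIC_API_KEY=sk-dummy` convention used later in this README; the proxy may not require every header.

```bash
# Illustrative request: Anthropic-style body sent to the local proxy.
curl -s http://127.0.0.1:11434/v1/messages \
  -H "content-type: application/json" \
  -H "x-api-key: sk-dummy" \
  -d '{
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 256,
    "messages": [
      {"role": "user", "content": "Say hello in one short sentence."}
    ]
  }'
```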
- Serves a local Anthropic-compatible API on `127.0.0.1:11434`.
- Translates `/v1/messages` requests into `/v1/chat/completions`.
- Supports streaming and non-streaming responses.
- Converts text, system prompts, image blocks, tool calls, and tool results between API formats.
- Exposes `/health` for quick checks.
- Exposes `/v1/models` with OpenCode Go models and Claude-style aliases.
- Reads model aliases from `models.json` and reloads them when the file changes.
- Keeps the real OpenCode API key in `OPENCODE_API_KEY`, outside the repo.
OpenCode Go uses the OpenAI chat completions format. Some coding clients expect Anthropic's messages format instead. This proxy sits between them:
Claude Code / Claw -> localhost:11434 -> OpenCode Proxy -> OpenCode Go / Zen
The clients are only redirected to a local Anthropic-compatible URL. Nothing else needs to be faked.
For example:
- Claude Code can use `ANTHROPIC_BASE_URL=http://127.0.0.1:11434` in `settings.json`.
- Claw can use a wrapper that exports the same `ANTHROPIC_BASE_URL` and `ANTHROPIC_API_KEY=sk-dummy` (see the sketch below).
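A minimal sketch of the wrapper approach; the `claw` command name is an assumption, so substitute whatever command actually launches Claw on your system:

```bash
#!/usr/bin/env bash
# Hypothetical wrapper: point Claw at the local proxy, then run it unchanged.
export ANTHROPIC_BASE_URL=http://127.0.0.1:11434
export ANTHROPIC_API_KEY=sk-dummy
exec claw "$@"
```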
After that, clients keep sending their normal model names:
- Claude Code may send `claude-sonnet-4-20250514`.
- Claw sends whichever Anthropic model name it selected.
The proxy does the mapping by itself:
claude-sonnet-4-20250514 -> models.json -> deepseek-v4-pro
OpenCode Go never sees the original Claude model name. It receives a normal OpenAI-format request with the mapped OpenCode Go model.
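To illustrate the upstream side, the forwarded request is roughly equivalent to the following. This is a sketch, not the proxy's actual code: the exact fields the proxy copies over are not shown, and the bearer-token header is an assumption based on the usual OpenAI-compatible convention.

```bash
# Roughly what the proxy sends upstream after translation (illustrative only).
curl -s https://opencode.ai/zen/go/v1/chat/completions \
  -H "content-type: application/json" \
  -H "authorization: Bearer $OPENCODE_API_KEY" \
  -d '{
    "model": "deepseek-v4-pro",
    "messages": [
      {"role": "user", "content": "Say hello in one short sentence."}
    ]
  }'
```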
In short:
- Clients are told that the Anthropic API is running on localhost.
- The proxy translates Anthropic requests into OpenAI requests.
- The proxy maps Claude-style model names to OpenCode Go model names.
- No other client-side patching is needed.
- Node.js 18 or newer (see the quick check below).
- An OpenCode API key in `OPENCODE_API_KEY`.
- Subscription tier in `OPENCODE_TIER`: `go` (default) or `zen`.
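A quick way to confirm the Node.js requirement:

```bash
# Should print v18.0.0 or newer.
node --version
```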
```bash
git clone https://github.com/bigdata2211it-web/opencode-proxy.git
cd opencode-proxy
cp .env.example .env
# Edit .env and set OPENCODE_API_KEY.

export OPENCODE_API_KEY=<your-opencode-key>
# Optional: switch from Go to Zen subscription (default: go)
export OPENCODE_TIER=zen

node index.js
```

The proxy starts on http://127.0.0.1:11434 by default.
To use another port:
```bash
node index.js 11435
```

Check that the proxy is responding:

```bash
curl http://127.0.0.1:11434/health
```

Point Anthropic-compatible clients at the local proxy:
```bash
export ANTHROPIC_BASE_URL=http://127.0.0.1:11434
export ANTHROPIC_API_KEY=sk-dummy
```

For Claude-style model names, use aliases such as:

```bash
export ANTHROPIC_DEFAULT_SONNET_MODEL=claude-sonnet-4-20250514
export ANTHROPIC_DEFAULT_OPUS_MODEL=claude-opus-4-20250514
export ANTHROPIC_DEFAULT_HAIKU_MODEL=claude-haiku-4-20250514
```

Edit `models.json` to choose which OpenCode Go model each Claude family should use:
```json
{
  "opus": "qwen3.6-plus",
  "sonnet": "deepseek-v4-pro",
  "haiku": "mimo-v2.5-pro"
}
```

The proxy expands these aliases automatically. For example, `sonnet`, `claude-sonnet-4`, and `claude-sonnet-4-20250514` can all map to the same target model.
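Because the proxy reloads `models.json` when the file changes, a mapping can be switched while it is running. A minimal sketch, assuming `jq` is installed (editing the file by hand works just as well):

```bash
# Point the "sonnet" family at a different OpenCode Go model; no proxy restart needed.
jq '.sonnet = "qwen3.5-plus"' models.json > models.json.tmp && mv models.json.tmp models.json
```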
Available OpenCode Go models currently listed by the proxy:

```
glm-5, glm-5.1, kimi-k2.5, kimi-k2.6, minimax-m2.5, minimax-m2.7,
deepseek-v4-flash, deepseek-v4-pro, qwen3.5-plus, qwen3.6-plus,
mimo-v2-pro, mimo-v2-omni, mimo-v2.5, mimo-v2.5-pro
```
- `HEAD /` and `HEAD /v1` - connection checks.
- `GET /health` - proxy status.
- `GET /v1/models` - available models and aliases.
- `POST /v1/messages` - Anthropic-compatible messages endpoint.
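For quick manual checks of these endpoints, the commands below are illustrative; the streaming request simply follows the Anthropic Messages convention of setting `"stream": true` and reads the response as server-sent events:

```bash
# Proxy status and the model/alias list.
curl -s http://127.0.0.1:11434/health
curl -s http://127.0.0.1:11434/v1/models

# A streaming request against the messages endpoint.
curl -N http://127.0.0.1:11434/v1/messages \
  -H "content-type: application/json" \
  -H "x-api-key: sk-dummy" \
  -d '{
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 128,
    "stream": true,
    "messages": [{"role": "user", "content": "Stream a short greeting."}]
  }'
```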
Create a local `.env` from `.env.example`, or provide the variables another way:

```bash
OPENCODE_API_KEY=<your-opencode-key>
OPENCODE_TIER=go   # or: zen
```

Do not commit `.env`; it is ignored by git.
| Tier | `OPENCODE_TIER` | Endpoint | Pricing |
|---|---|---|---|
| Go | `go` (default) | https://opencode.ai/zen/go/v1/chat/completions | $5 first month, then $10/month (flat) |
| Zen | `zen` | https://opencode.ai/zen/v1/chat/completions | Pay-as-you-go, no limits |
Both tiers use the same API key and the same set of open models (Qwen, GLM, Kimi, MiniMax, DeepSeek, MiMo). Zen additionally provides Claude, GPT, Gemini, and several free models.
To switch tiers, change `OPENCODE_TIER` and restart the proxy.
For free AI tools, news, and project updates, subscribe to the Telegram channel:
For direct questions or feedback, message:
This project is intentionally small: one Node.js entrypoint, one model mapping file, and no external runtime dependencies.
It supports both OpenCode Go (flat subscription) and OpenCode Zen (pay-as-you-go), selected via `OPENCODE_TIER`.