What it is
opencode is an open-source agentic coding harness — a terminal-native runtime that wraps any LLM with a planning loop, tool calls, file I/O, and shell execution. Its defining trait is model agnosticism: opencode isn't locked to any single provider. You point it at whichever frontier model fits your budget and your stack.
That makes it interesting for teams that want a fully open stack, prefer to self-host, or run local/open-weight models like Llama 4, DeepSeek V4, or Qwen.
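Model agnosticism in practice means switching providers is a config change, not a harness change. A minimal sketch of an `opencode.json`, assuming the `provider/model` identifier convention from opencode's docs (the model name here is illustrative):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "model": "anthropic/claude-sonnet-4-5"
}
```

Pointing the same harness at an open-weight model is the same one-line change, e.g. a locally served model behind an OpenAI-compatible endpoint.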
Why we're tracking it
- Open-source. The harness itself is auditable, forkable, and not gated by a vendor.
- Model-agnostic. Same harness behavior across Claude, GPT, Gemini, and open-weight models — useful for fair benchmarking.
- Terminal-native. Designed for the same shell + file + build loop that Claude Code and Codex evaluate on.
Capabilities at a glance
- Sub-agents: yes — `build` (primary), `plan` (read-only), and a `general` subagent; `mode` field per agent.
- Skills: yes — `SKILL.md` with frontmatter, searched in the workspace, `~/.opencode`, and project dirs.
- MCP servers: yes — stdio / HTTP / SSE; prompts and resources surface as commands.
- Hooks: yes — extensive plugin hook system: `chat.message`, `chat.params`, `tool.execute.before/after`, `permission.ask`, `shell.env`, session compaction, experimental transforms.
- Slash commands: yes — built-ins (`/init`, `/review`) plus user-defined and skill-derived commands.
- Permissions / sandboxing: per-agent rulesets; `build` permissive, `plan` read-only; file-pattern denials (e.g. `.env`).
- Plugins: yes — first-class `@opencode-ai/plugin` SDK with workspace adaptors, auth (OAuth/API), provider hooks, tool registration, OpenTUI components.
- Multi-model: 12+ providers — Anthropic, OpenAI, Google/Vertex, GitHub Copilot, Bedrock, Azure, OpenRouter, Mistral, GitLab, OpenCode Zen; custom via plugin `ProviderHook`.
- Sessions: partial — LLM-summary compaction (goal/progress/blockers/next steps); SQLite persistence; no explicit checkpoint API.
- Surfaces: TUI (OpenTUI + Solid), web (Vite + SolidJS), Electron desktop, IDE extensions; `opencode serve` for headless.
- Headless / SDK: yes — `createOpencodeServer` / `createOpencodeClient` in `@opencode-ai/sdk`; HTTP API; usable from Node or browser.
- License: MIT; github.com/sst/opencode.
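To make the hook system above concrete, here is a minimal sketch of the shape such a plugin takes. The hook name matches opencode's documented hook list; everything else (the `ToolCall` type, the argument fields, the plugin object shape) is a simplified assumption, not the real `@opencode-ai/plugin` types:

```typescript
// Sketch of an opencode-style plugin hook. The hook name
// "tool.execute.before" comes from the hook list above; the argument
// shape below is a simplified stand-in for the real SDK types.

type ToolCall = { tool: string; args: Record<string, unknown> };

// A hypothetical audit plugin: logs every tool call and blocks shell
// commands that touch .env files, mirroring the file-pattern denials
// described under Permissions / sandboxing.
const auditPlugin = {
  "tool.execute.before": (call: ToolCall): ToolCall => {
    const cmd = String(call.args.command ?? "");
    if (call.tool === "bash" && cmd.includes(".env")) {
      throw new Error("blocked: .env access denied by audit plugin");
    }
    console.log(`tool call: ${call.tool}`);
    return call;
  },
};

// Passes a harmless read through; would throw on `cat .env`:
auditPlugin["tool.execute.before"]({ tool: "read", args: { path: "src/index.ts" } });
```

In the real SDK the plugin would be registered through `@opencode-ai/plugin` and receive the harness's own typed arguments rather than this stand-in shape.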
Status in TeamDay
Coming soon. opencode is on the roadmap as a fourth harness option in the Agent Settings dropdown. We'll publish a launch post here when it's wired in. Until then, the live options are Claude Code, Codex, and Gemini CLI.
How it'll fit
When opencode lands, selecting it as an agent's harness will unlock any model the harness supports — including open-weight options that today require a separate self-hosted stack. The MCP server attachment story stays identical: the same media MCP that hosts Seedance 2.0 and gpt-image-2 will work behind opencode agents.
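For reference, attaching an MCP server is a config-level concern. A hedged sketch of what an `opencode.json` MCP entry looks like; the `mcp` key and `type`/`url` fields follow opencode's documented conventions as we understand them, and the server name and URL are placeholders:

```json
{
  "mcp": {
    "media": {
      "type": "remote",
      "url": "https://mcp.example.com/media"
    }
  }
}
```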
Learn more
- Project site: opencode.ai
- Benchmark reference: Terminal-Bench