# Models

Configure which AI model your agent uses, enable runtime model switching for end users, and connect to hundreds of models via OpenRouter.
## Setting a model

Pass a model ID to the `model` field in your agent config. The default is `claude-sonnet-4-6`.
```typescript
import { agent } from "@21st-sdk/agent"

export default agent({
  model: "claude-sonnet-4-6",
  systemPrompt: "You are a helpful assistant.",
})
```

## Native Anthropic models
These models work out of the box with the Claude Code runtime. No extra configuration needed.
| Model | ID | Best for |
|---|---|---|
| Claude Opus 4.6 | claude-opus-4-6 | Top-tier reasoning, complex multi-step tasks |
| Claude Sonnet 4.6 | claude-sonnet-4-6 | Best balance of speed, cost, and intelligence |
| Claude Haiku 4.5 | claude-haiku-4-5-20251001 | Fastest and cheapest, great for simple tasks |
## Model switching

Let end users switch between models at runtime without redeploying. Pass an `allowedModels` array with two or more model IDs. The value of `model` is the default; the remaining entries in `allowedModels` appear in the model selector.
```typescript
import { agent } from "@21st-sdk/agent"

export default agent({
  model: "claude-sonnet-4-6",
  allowedModels: ["claude-sonnet-4-6", "claude-opus-4-6", "claude-haiku-4-5-20251001"],
  systemPrompt: "You are a helpful assistant.",
})
```

When `allowedModels` has more than one entry, a model selector appears in the chat UI. The user's choice overrides the default `model` value for that session.
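The session-level override described above can be sketched as a small helper. This is illustrative only; `resolveSessionModel` and `AgentModelConfig` are hypothetical names, not part of the SDK, which handles this internally:

```typescript
// Illustrative sketch of runtime model resolution; not part of @21st-sdk/agent.
type AgentModelConfig = {
  model: string;            // default model for new sessions
  allowedModels?: string[]; // options shown in the chat UI's model selector
};

// Returns the model to use for a session: the user's pick if it is
// in the allow-list, otherwise the configured default.
function resolveSessionModel(config: AgentModelConfig, userChoice?: string): string {
  const allowed = config.allowedModels ?? [config.model];
  if (userChoice && allowed.includes(userChoice)) {
    return userChoice;
  }
  return config.model;
}
```

Note that a selection outside `allowedModels` falls back to the default rather than erroring, which keeps a stale or tampered client from breaking the session.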
You can mix native and OpenRouter models in the `allowedModels` array: for example, offer DeepSeek for cost and Claude Opus for quality in one agent.

## Using OpenRouter models
Through OpenRouter, you can use hundreds of models from Google, Meta, DeepSeek, Alibaba, OpenAI, and others. Pass the OpenRouter model ID directly in your agent config:
```typescript
import { agent } from "@21st-sdk/agent"

export default agent({
  model: "deepseek/deepseek-v3.2",
  systemPrompt: "You are a helpful assistant.",
})
```

OpenRouter handles authentication, billing, and failover across providers. Browse all available models and their IDs at openrouter.ai/models.
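OpenRouter IDs use a `provider/model` slug, while the native Anthropic IDs on this page contain no slash. If you need to tell the two apart in your own code, a one-line check suffices (a convention inferred from the ID formats shown here, not an SDK API):

```typescript
// OpenRouter model IDs look like "deepseek/deepseek-v3.2" (provider/model slug);
// native Anthropic IDs like "claude-sonnet-4-6" have no slash.
function isOpenRouterModelId(id: string): boolean {
  return id.includes("/");
}
```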
You can combine OpenRouter and native models in `allowedModels` to give users a choice:

```typescript
import { agent } from "@21st-sdk/agent"

export default agent({
  model: "deepseek/deepseek-v3.2",
  allowedModels: ["deepseek/deepseek-v3.2", "google/gemini-3.1-pro", "claude-sonnet-4-6"],
  systemPrompt: "You are a helpful assistant.",
})
```

## Popular OpenRouter models
| Provider | Model | ID | Context |
|---|---|---|---|
| Google | Gemini 3.1 Pro | google/gemini-3.1-pro | 1M |
| Google | Gemini 3.0 Flash | google/gemini-3.0-flash | 1M |
| DeepSeek | DeepSeek V3.2 | deepseek/deepseek-v3.2 | 128K |
| DeepSeek | DeepSeek R1 | deepseek/deepseek-r1 | 128K |
| Meta | Llama 4 Maverick | meta-llama/llama-4-maverick | 256K |
| Meta | Llama 4 Scout | meta-llama/llama-4-scout | 10M |
| Alibaba | Qwen 3.5 Max | qwen/qwen-3.5-max | 128K |
| OpenAI | GPT-5.4 | openai/gpt-5.4 | 128K |
For the full list of 200+ models with live pricing, see openrouter.ai/models.
## Choosing the right model

- **Complex agentic tasks** — Claude Opus 4.6 or GPT-5.4. Best for multi-step reasoning and tool use where accuracy matters most.
- **Production workloads** — Claude Sonnet 4.6 or Gemini 3.1 Pro. Best price-to-performance. Sonnet is the default.
- **Cost-sensitive tasks** — DeepSeek V3.2 delivers near-frontier performance at roughly 10x lower cost. Qwen 3.5 and Llama 4 are strong alternatives.
- **Massive context** — Gemini 3.1 Pro (1M tokens) or Llama 4 Scout (10M tokens) for entire codebases or long documents.
- **Quick, simple tasks** — Claude Haiku 4.5 is the fastest and cheapest while still being highly capable.
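The guidance above can be condensed into a lookup table if you route requests by task type. The `TaskProfile` names are illustrative assumptions; the model IDs are the ones listed on this page:

```typescript
// Illustrative task-to-model routing based on the guidance above; not an SDK feature.
type TaskProfile = "complex" | "production" | "cost-sensitive" | "long-context" | "simple";

const modelForTask: Record<TaskProfile, string> = {
  complex: "claude-opus-4-6",                 // multi-step reasoning, heavy tool use
  production: "claude-sonnet-4-6",            // best price-to-performance
  "cost-sensitive": "deepseek/deepseek-v3.2", // near-frontier quality at lower cost
  "long-context": "google/gemini-3.1-pro",    // 1M-token context window
  simple: "claude-haiku-4-5-20251001",        // fastest and cheapest
};
```

A map like this pairs naturally with `allowedModels`: list every value as an option and pick the default per agent.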