Agents SDK

Models

Configure which AI model your agent uses, enable runtime model switching for end users, and connect to hundreds of models via OpenRouter.

Setting a model

Pass a model ID to the model field in your agent config. The default is claude-sonnet-4-6.

agents/my-agent/index.ts
```ts
import { agent } from "@21st-sdk/agent"

export default agent({
  model: "claude-sonnet-4-6",
  systemPrompt: "You are a helpful assistant.",
})
```

Native Anthropic models

These models work out of the box with the Claude Code runtime. No extra configuration needed.

| Model | ID | Best for |
| --- | --- | --- |
| Claude Opus 4.6 | claude-opus-4-6 | Top-tier reasoning, complex multi-step tasks |
| Claude Sonnet 4.6 | claude-sonnet-4-6 | Best balance of speed, cost, and intelligence |
| Claude Haiku 4.5 | claude-haiku-4-5-20251001 | Fastest and cheapest, great for simple tasks |

Model switching

Let end users switch between models at runtime without redeploying. Pass an allowedModels array with two or more model IDs. The value of model is the default; the remaining entries in allowedModels appear in the model selector.

agents/my-agent/index.ts
```ts
import { agent } from "@21st-sdk/agent"

export default agent({
  model: "claude-sonnet-4-6",
  allowedModels: ["claude-sonnet-4-6", "claude-opus-4-6", "claude-haiku-4-5-20251001"],
  systemPrompt: "You are a helpful assistant.",
})
```

When allowedModels has more than one entry, a model selector appears in the chat UI. The user's choice overrides the default model value for that session.

You can mix native and OpenRouter models in the same allowedModels array — for example, offer DeepSeek for cost and Claude Opus for quality in one agent.

Using OpenRouter models

Through OpenRouter, you can use hundreds of models from Google, Meta, DeepSeek, Alibaba, OpenAI, and others. Pass the OpenRouter model ID directly in your agent config:

agents/my-agent/index.ts
```ts
import { agent } from "@21st-sdk/agent"

export default agent({
  model: "deepseek/deepseek-v3.2",
  systemPrompt: "You are a helpful assistant.",
})
```

OpenRouter handles authentication, billing, and failover across providers. Browse all available models and their IDs at openrouter.ai/models.

You can combine OpenRouter and native models in allowedModels to give users a choice:

agents/my-agent/index.ts
```ts
import { agent } from "@21st-sdk/agent"

export default agent({
  model: "deepseek/deepseek-v3.2",
  allowedModels: ["deepseek/deepseek-v3.2", "google/gemini-3.1-pro", "claude-sonnet-4-6"],
  systemPrompt: "You are a helpful assistant.",
})
```

Popular OpenRouter models

| Provider | Model | ID | Context |
| --- | --- | --- | --- |
| Google | Gemini 3.1 Pro | google/gemini-3.1-pro | 1M |
| Google | Gemini 3.0 Flash | google/gemini-3.0-flash | 1M |
| DeepSeek | DeepSeek V3.2 | deepseek/deepseek-v3.2 | 128K |
| DeepSeek | DeepSeek R1 | deepseek/deepseek-r1 | 128K |
| Meta | Llama 4 Maverick | meta-llama/llama-4-maverick | 256K |
| Meta | Llama 4 Scout | meta-llama/llama-4-scout | 10M |
| Alibaba | Qwen 3.5 Max | qwen/qwen-3.5-max | 128K |
| OpenAI | GPT-5.4 | openai/gpt-5.4 | 128K |

For the full list of 200+ models with live pricing, see openrouter.ai/models.

Choosing the right model

Complex agentic tasks — Claude Opus 4.6 or GPT-5.4. Best for multi-step reasoning and tool use where accuracy matters most.

Production workloads — Claude Sonnet 4.6 or Gemini 3.1 Pro. Best price-to-performance. Sonnet is the default.

Cost-sensitive tasks — DeepSeek V3.2 delivers near-frontier performance at ~10x lower cost. Qwen 3.5 and Llama 4 are strong alternatives.

Massive context — Gemini 3.1 Pro (1M tokens) or Llama 4 Scout (10M tokens) for entire codebases or long documents.

Quick simple tasks — Claude Haiku 4.5 is the fastest and cheapest while still being highly capable.
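The guidance above can be sketched as a small helper that maps a task profile to a model ID from the tables on this page. The pickModel function and its profile names are illustrative only, not part of the SDK:

```typescript
// Illustrative helper (not an SDK API): pick a model ID based on the
// task profile, following the recommendations above.
type TaskProfile = "complex" | "production" | "cost-sensitive" | "long-context" | "quick"

function pickModel(profile: TaskProfile): string {
  switch (profile) {
    case "complex":
      return "claude-opus-4-6"           // multi-step reasoning and tool use
    case "production":
      return "claude-sonnet-4-6"         // best price-to-performance (the default)
    case "cost-sensitive":
      return "deepseek/deepseek-v3.2"    // near-frontier performance, much cheaper
    case "long-context":
      return "meta-llama/llama-4-scout"  // 10M-token context for huge inputs
    case "quick":
      return "claude-haiku-4-5-20251001" // fastest and cheapest
  }
}

console.log(pickModel("production")) // → claude-sonnet-4-6
```

The returned ID can be passed straight to the model field (or included in allowedModels) in your agent config.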

What's next