
Migration Guide: From providers to model_list

This guide explains how to migrate from the legacy providers configuration to the new model_list format.

Why Migrate?

The new model_list configuration offers several advantages:

  • Zero-code provider addition: Add OpenAI-compatible providers with configuration only
  • Load balancing: Configure multiple endpoints for the same model
  • Protocol-based routing: Use prefixes like openai/, anthropic/, etc.
  • Cleaner configuration: Model-centric instead of vendor-centric

Timeline

Version   Status
v1.x      model_list introduced, providers deprecated but functional
v1.x+1    Prominent deprecation warnings, migration tool available
v2.0      providers configuration removed

Before and After

Before: Legacy providers Configuration

{
  "providers": {
    "openai": {
      "api_key": "sk-your-openai-key",
      "api_base": "https://api.openai.com/v1"
    },
    "anthropic": {
      "api_key": "sk-ant-your-key"
    },
    "deepseek": {
      "api_key": "sk-your-deepseek-key"
    }
  },
  "agents": {
    "defaults": {
      "provider": "openai",
      "model": "gpt-5.2"
    }
  }
}

After: New model_list Configuration

{
  "model_list": [
    {
      "model_name": "gpt4",
      "model": "openai/gpt-5.2",
      "api_key": "sk-your-openai-key",
      "api_base": "https://api.openai.com/v1"
    },
    {
      "model_name": "claude-sonnet-4.6",
      "model": "anthropic/claude-sonnet-4.6",
      "api_key": "sk-ant-your-key"
    },
    {
      "model_name": "deepseek",
      "model": "deepseek/deepseek-chat",
      "api_key": "sk-your-deepseek-key"
    }
  ],
  "agents": {
    "defaults": {
      "model": "gpt4"
    }
  }
}

Protocol Prefixes

The model field uses a protocol prefix format: [protocol/]model-identifier

Prefix           Description                   Example
openai/          OpenAI API (default)          openai/gpt-5.2
anthropic/       Anthropic API                 anthropic/claude-opus-4
antigravity/     Google via Antigravity OAuth  antigravity/gemini-2.0-flash
gemini/          Google Gemini API             gemini/gemini-2.0-flash-exp
claude-cli/      Claude CLI (local)            claude-cli/claude-sonnet-4.6
codex-cli/       Codex CLI (local)             codex-cli/codex-4
github-copilot/  GitHub Copilot                github-copilot/gpt-4o
openrouter/      OpenRouter                    openrouter/anthropic/claude-sonnet-4.6
groq/            Groq API                      groq/llama-3.1-70b
deepseek/        DeepSeek API                  deepseek/deepseek-chat
cerebras/        Cerebras API                  cerebras/llama-3.3-70b
qwen/            Alibaba Qwen                  qwen/qwen-max
zhipu/           Zhipu AI                      zhipu/glm-4
nvidia/          NVIDIA NIM                    nvidia/llama-3.1-nemotron-70b
ollama/          Ollama (local)                ollama/llama3
vllm/            vLLM (local)                  vllm/my-model
moonshot/        Moonshot AI                   moonshot/moonshot-v1-8k
shengsuanyun/    ShengSuanYun                  shengsuanyun/deepseek-v3
volcengine/      Volcengine                    volcengine/doubao-pro-32k

Note: If no prefix is specified, openai/ is used as the default.
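The prefix rule above can be sketched as a small helper. This is illustrative only (the project's actual parser may differ); note that only the first slash is treated as the separator, so nested identifiers such as openrouter/anthropic/claude-sonnet-4.6 keep their remainder intact:

```python
def split_model(model: str) -> tuple[str, str]:
    """Split a model string into (protocol, identifier).

    Falls back to the default "openai" protocol when no prefix is given.
    Only the first "/" is treated as the separator, so identifiers that
    themselves contain slashes survive unchanged.
    """
    if "/" in model:
        protocol, identifier = model.split("/", 1)
        return protocol, identifier
    return "openai", model
```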

ModelConfig Fields

Field             Required  Description
model_name        Yes       User-facing alias for the model
model             Yes       Protocol and model identifier (e.g., openai/gpt-5.2)
api_base          No        API endpoint URL
api_key           No*       API authentication key
proxy             No        HTTP proxy URL
auth_method       No        Authentication method: oauth, token
connect_mode      No        Connection mode for CLI providers: stdio, grpc
rpm               No        Requests per minute limit
max_tokens_field  No        Field name for max tokens

*api_key is required for HTTP-based protocols unless api_base points to a local server.
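The footnote's rule could be checked roughly as follows. This is a sketch, not the project's validator: the LOCAL_PROTOCOLS set and the validate_entry name are assumptions, and the error text mirrors the message shown in the Troubleshooting section below.

```python
# Assumption: these protocols talk to local processes and need no key.
LOCAL_PROTOCOLS = {"ollama", "vllm", "claude-cli", "codex-cli"}

def validate_entry(entry: dict) -> None:
    """Raise if an HTTP-based entry has neither api_key nor api_base."""
    model = entry["model"]
    # No prefix means the default "openai" protocol.
    protocol = model.split("/", 1)[0] if "/" in model else "openai"
    if protocol in LOCAL_PROTOCOLS:
        return
    if not entry.get("api_key") and not entry.get("api_base"):
        raise ValueError(
            f'api_key or api_base is required for HTTP-based protocol "{protocol}"'
        )
```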

Load Balancing

Configure multiple endpoints for the same model to distribute load:

{
  "model_list": [
    {
      "model_name": "gpt4",
      "model": "openai/gpt-5.2",
      "api_key": "sk-key1",
      "api_base": "https://api1.example.com/v1"
    },
    {
      "model_name": "gpt4",
      "model": "openai/gpt-5.2",
      "api_key": "sk-key2",
      "api_base": "https://api2.example.com/v1"
    },
    {
      "model_name": "gpt4",
      "model": "openai/gpt-5.2",
      "api_key": "sk-key3",
      "api_base": "https://api3.example.com/v1"
    }
  ]
}

When you request the model gpt4, requests are distributed across all three endpoints using round-robin selection.
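Round-robin selection over duplicate model_name entries can be sketched like this (RoundRobinRouter is a hypothetical name for illustration, not the project's implementation):

```python
import itertools

class RoundRobinRouter:
    """Cycle through all model_list entries that share a model_name."""

    def __init__(self, model_list: list[dict]):
        groups: dict[str, list[dict]] = {}
        for entry in model_list:
            groups.setdefault(entry["model_name"], []).append(entry)
        # One independent cycle per alias, in configuration order.
        self._cycles = {
            name: itertools.cycle(entries) for name, entries in groups.items()
        }

    def pick(self, model_name: str) -> dict:
        if model_name not in self._cycles:
            raise KeyError(f'model "{model_name}" not found in model_list')
        return next(self._cycles[model_name])
```

With the three-endpoint example above, successive pick("gpt4") calls rotate through api1, api2, api3 and back to api1.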

Adding a New OpenAI-Compatible Provider

With model_list, adding a new provider requires zero code changes:

{
  "model_list": [
    {
      "model_name": "my-custom-llm",
      "model": "openai/my-model-v1",
      "api_key": "your-api-key",
      "api_base": "https://api.your-provider.com/v1"
    }
  ]
}

Just specify openai/ as the protocol (or omit it for the default), and provide your provider's API base URL.

Backward Compatibility

During the migration period, your existing providers configuration will continue to work:

  1. If model_list is empty and providers has data, the system auto-converts internally
  2. A deprecation warning is logged: "providers config is deprecated, please migrate to model_list"
  3. All existing functionality remains unchanged
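The auto-conversion can be illustrated roughly as follows. Since the legacy format never stored a model identifier per provider, this sketch has the caller supply one; providers_to_model_list and default_models are hypothetical names, not the system's internal API:

```python
def providers_to_model_list(providers: dict, default_models: dict) -> list[dict]:
    """Sketch of migrating a legacy providers block to model_list entries.

    default_models maps each provider name to the model identifier to
    expose (e.g. {"openai": "gpt-5.2"}); the provider name doubles as
    both the alias and the protocol prefix in this simplified version.
    """
    model_list = []
    for name, creds in providers.items():
        entry = {"model_name": name, "model": f"{name}/{default_models[name]}"}
        # Carry credentials across unchanged.
        if "api_key" in creds:
            entry["api_key"] = creds["api_key"]
        if "api_base" in creds:
            entry["api_base"] = creds["api_base"]
        model_list.append(entry)
    return model_list
```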

Migration Checklist

  • Identify all providers you're currently using
  • Create model_list entries for each provider
  • Use appropriate protocol prefixes
  • Update agents.defaults.model to reference the new model_name
  • Test that all models work correctly
  • Remove or comment out the old providers section
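A few of these checklist items can be verified mechanically. The check_migration helper below is an illustrative sketch, not a tool shipped with the project:

```python
def check_migration(config: dict) -> list[str]:
    """Return a list of migration problems found in a config dict."""
    problems = []
    names = {e.get("model_name") for e in config.get("model_list", [])}
    default = config.get("agents", {}).get("defaults", {}).get("model")
    if not names:
        problems.append("model_list is empty")
    if default and default not in names:
        problems.append(
            f'agents.defaults.model "{default}" has no matching model_name'
        )
    if "providers" in config:
        problems.append("legacy providers section still present")
    return problems
```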

Troubleshooting

Model not found error

model "xxx" not found in model_list or providers

Solution: Ensure the model_name in model_list matches the value in agents.defaults.model.

Unknown protocol error

unknown protocol "xxx" in model "xxx/model-name"

Solution: Use a supported protocol prefix. See the Protocol Prefixes table above.

Missing API key error

api_key or api_base is required for HTTP-based protocol "xxx"

Solution: Provide api_key and/or api_base for HTTP-based providers.

Need Help?