# Models
The Letta API supports all major frontier model providers.
A selection of models is shown below; for the complete model list with current pricing, see app.letta.com/models.
| Model | Provider | Max Context | Vision Support? |
|---|---|---|---|
|  | Anthropic | 200k | ✓ |
|  | Anthropic | 200k | ✓ |
|  | Anthropic | 200k | ✓ |
|  | Anthropic | 200k | ✓ |
|  | OpenAI | 272k | ✓ |
|  | OpenAI | 272k | ✓ |
|  |  | 1M | ✓ |
|  |  | 1M | ✓ |
|  | Z.ai | 200k | - |
|  | MiniMax | 180k | - |
|  | Moonshot | 262k | ✓ |
## Bring your own keys (BYOK)

Connect your own LLM API keys to use additional models from providers like OpenRouter, or to connect external coding plans. When using BYOK instead of hosted models through the Letta API, you are billed directly by the LLM API provider.
To configure BYOK in Letta Code, use the `/connect` command (see full docs).
Providers can also be configured via the Letta Platform models page.
## Configuring models in the Letta API

### Model handles

Models are specified using `provider/model-name` formatted handles:
```python
agent = client.agents.create(
    model="anthropic/claude-sonnet-4-5-20250929",
)
```

```typescript
const agent = await client.agents.create({
  model: "anthropic/claude-sonnet-4-5-20250929",
});
```

### Providers

| Provider | Prefix | Example |
|---|---|---|
| OpenAI | openai/ | openai/gpt-5.2 |
| Anthropic | anthropic/ | anthropic/claude-sonnet-4-5-20250929 |
| Google AI | google_ai/ | google_ai/gemini-3-flash |
| Azure | azure/ | azure/gpt-4o |
| AWS Bedrock | bedrock/ | bedrock/anthropic.claude-3-5-sonnet |
| OpenRouter | openrouter/ | openrouter/anthropic/claude-3.5-sonnet |
| Ollama | ollama/ | ollama/llama3.2 |
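Note that a handle splits on the first slash only: the provider prefix comes first, and everything after it is the model name, which may itself contain slashes (as in the OpenRouter example above). A minimal sketch of that parsing rule; the `parse_handle` helper is hypothetical and not part of the Letta SDK:

```python
def parse_handle(handle: str) -> tuple[str, str]:
    """Split a provider/model handle on the first slash only.

    Model names may themselves contain slashes, as with
    OpenRouter routes like "openrouter/anthropic/claude-3.5-sonnet".
    """
    provider, sep, model = handle.partition("/")
    if not sep or not provider or not model:
        raise ValueError(f"invalid handle: {handle!r}")
    return provider, model

# The provider prefix selects the backend; the remainder is the model name.
print(parse_handle("openai/gpt-5.2"))
print(parse_handle("openrouter/anthropic/claude-3.5-sonnet"))
```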
### Model settings

Configure temperature, max tokens, and other settings:
```python
agent = client.agents.create(
    model="openai/gpt-5.2",
    model_settings={
        "provider_type": "openai",
        "temperature": 0.7,
        "max_output_tokens": 4096,
    },
    context_window_limit=128000,
)
```

```typescript
const agent = await client.agents.create({
  model: "openai/gpt-5.2",
  model_settings: {
    provider_type: "openai",
    temperature: 0.7,
    max_output_tokens: 4096,
  },
  context_window_limit: 128000,
});
```

### Reasoning settings
For OpenAI reasoning models (o1, o3):
```python
agent = client.agents.create(
    model="openai/o3-mini",
    model_settings={
        "provider_type": "openai",
        "reasoning": {
            "reasoning_effort": "medium"  # "low", "medium", or "high"
        }
    }
)
```

For Claude extended thinking:
```python
agent = client.agents.create(
    model="anthropic/claude-sonnet-4-5-20250929",
    model_settings={
        "provider_type": "anthropic",
        "thinking": {
            "type": "enabled",
            "budget_tokens": 10000
        }
    }
)
```

### Changing models
Update an existing agent’s model:
```python
client.agents.update(
    agent_id=agent.id,
    model="openai/gpt-5.2",
    context_window_limit=64000,
)
```

```typescript
await client.agents.update(agent.id, {
  model: "openai/gpt-5.2",
  context_window_limit: 64000,
});
```

Agents keep their memory and tools when you change models.