OpenAI-compatible endpoint
OpenAI proxy endpoints are not officially supported, and you are likely to encounter errors. We strongly recommend using providers directly instead of via proxy endpoints (for example, using the Anthropic API directly rather than accessing Claude through OpenRouter). For questions and support, you can chat with the dev team and community on our Discord server.
To use OpenAI-compatible (`/v1/chat/completions`) endpoints with Letta, those endpoints must support function/tool calling.
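A quick way to check is to send a standard ChatCompletions request that includes a `tools` parameter and confirm the endpoint returns a `tool_calls` message rather than an error. A sketch (the endpoint URL, API key, model name, and weather tool below are all placeholders):

```sh
# Probe an OpenAI-compatible endpoint for tool-calling support.
curl "https://your-endpoint.example.com/v1/chat/completions" \
  -H "Authorization: Bearer your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "your-model-name",
    "messages": [{"role": "user", "content": "What is the weather in Berlin?"}],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
          "type": "object",
          "properties": {"city": {"type": "string"}},
          "required": ["city"]
        }
      }
    }]
  }'
# A tool-calling-capable endpoint responds with a message containing "tool_calls".
```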
You can configure Letta to use OpenAI-compatible `ChatCompletions` endpoints by setting `OPENAI_API_BASE` in your environment variables (in addition to setting `OPENAI_API_KEY`).
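For example (a sketch; the URL is a placeholder for your provider's OpenAI-compatible base URL):

```sh
# Both variables must be set before starting Letta.
export OPENAI_API_BASE="https://your-provider.example.com/v1"  # placeholder base URL
export OPENAI_API_KEY="your-api-key"                           # placeholder key
```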
OpenRouter example
Create an account on OpenRouter, then create an API key.
Once you have your API key, set both `OPENAI_API_KEY` and `OPENAI_API_BASE` in your environment variables.
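A minimal sketch (the base URL is OpenRouter's OpenAI-compatible endpoint; replace the placeholder with your own key):

```sh
export OPENAI_API_KEY="sk-or-..."                      # your OpenRouter API key
export OPENAI_API_BASE="https://openrouter.ai/api/v1"  # OpenRouter's OpenAI-compatible base URL
```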
Now, when we run `letta run` in the CLI, we can select OpenRouter models from the list of available models.
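With both variables exported in the same shell, start the CLI:

```sh
letta run
```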
For information on how to configure the Letta server or Letta Python SDK to use OpenRouter or other OpenAI-compatible endpoint providers, refer to our guide on using OpenAI.