LLM configuration
OpenAI-compatible endpoint
To use OpenAI-compatible (`/v1/chat/completions`) endpoints with Letta, those endpoints must support function/tool calling. You can configure Letta to use an OpenAI-compatible ChatCompletions endpoint by setting `OPENAI_API_BASE` in your environment variables (in addition to setting `OPENAI_API_KEY`).
OpenRouter example
Create an account on OpenRouter, then create an API key.
Once you have your API key, set both `OPENAI_API_KEY` and `OPENAI_API_BASE` in your environment variables:
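For example, in a POSIX shell (the base URL below is OpenRouter's standard OpenAI-compatible endpoint; the key value is a placeholder for the key you created):

```shell
# Replace with the API key created in your OpenRouter account
export OPENAI_API_KEY="sk-or-..."
# Point Letta's OpenAI-compatible client at OpenRouter
export OPENAI_API_BASE="https://openrouter.ai/api/v1"
```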
Now, when we run `letta run` in the CLI, we can select OpenRouter models from the list of available models:
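With both variables exported in the same shell session, launching the CLI is a single command (this assumes the `letta` package is already installed):

```shell
# Start the Letta CLI; with OPENAI_API_BASE pointed at OpenRouter,
# the model selection list will include OpenRouter models
letta run
```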
For information on how to configure the Letta server or the Letta Python SDK to use OpenRouter or other OpenAI-compatible endpoint providers, refer to our guide on using OpenAI.