OpenAI-compatible endpoint
Connect to OpenAI-compatible API proxies and custom endpoints.
You can configure Letta to use any OpenAI-compatible Chat Completions endpoint by setting `OPENAI_API_BASE` in your environment variables (in addition to setting `OPENAI_API_KEY`).
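If you run the Letta server directly (outside of Docker), you can export both variables in your shell before starting the server. This is a minimal sketch; the base URL and key below are placeholders for your provider's actual values:

```shell
# Point Letta at an OpenAI-compatible proxy (placeholder values)
export OPENAI_API_BASE="https://your-provider.com/v1"
export OPENAI_API_KEY="your_api_key"
```

The base URL should point at the provider's `/v1` root, not at a specific route like `/v1/chat/completions`.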
Using with Docker
Set both `OPENAI_API_BASE` and `OPENAI_API_KEY` environment variables when you use `docker run`:
```shell
# replace `~/.letta/.persist/pgdata` with wherever you want to store your agent data
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e OPENAI_API_BASE="https://your-provider.com/v1" \
  -e OPENAI_API_KEY="your_api_key" \
  letta/letta:latest
```

See the self-hosting guide for more information on running Letta with Docker.
Once the Letta server is running, you can select models from your provider via the model dropdown in the ADE or via the Python SDK.
For information on how to configure agents to use OpenAI-compatible endpoint providers, refer to our guide on using OpenAI.