
OpenAI-compatible endpoint

Connect to OpenAI-compatible API proxies and custom endpoints.

You can configure Letta to use OpenAI-compatible ChatCompletions endpoints by setting OPENAI_API_BASE in your environment variables (in addition to setting OPENAI_API_KEY).

Set both OPENAI_API_BASE and OPENAI_API_KEY environment variables when you use docker run:

```shell
# replace `~/.letta/.persist/pgdata` with wherever you want to store your agent data
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e OPENAI_API_BASE="https://your-provider.com/v1" \
  -e OPENAI_API_KEY="your_api_key" \
  letta/letta:latest
```

See the self-hosting guide for more information on running Letta with Docker.

Once the Letta server is running, you can select models from your provider via the ADE dropdown or the Python SDK.
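From the Python SDK, you reference a model by its handle. A minimal sketch, assuming the `letta-client` package and a server started as shown above; the `openai/<model-name>` handle convention follows Letta's provider-prefix scheme, and `gpt-4o-mini` is a placeholder — substitute whatever model name your OpenAI-compatible provider actually serves:

```python
def openai_handle(model_name: str) -> str:
    """Build a Letta model handle for a model served behind the
    OpenAI-compatible endpoint (provider prefix + model name)."""
    return f"openai/{model_name}"


def create_agent() -> None:
    # Imported lazily so the helper above works without the SDK installed.
    from letta_client import Letta

    # Point the client at the self-hosted server from the docker run above.
    client = Letta(base_url="http://localhost:8283")

    agent = client.agents.create(
        model=openai_handle("gpt-4o-mini"),  # hypothetical model name
        embedding="openai/text-embedding-3-small",  # assumed embedding handle
    )
    print(agent.id)
```

Call `create_agent()` against a running server; the handle helper just makes the provider-prefix convention explicit.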

For information on how to configure agents to use OpenAI-compatible endpoint providers, refer to our guide on using OpenAI.