# Model providers

Configure LLM providers for your Docker-hosted Letta server.
This page shows how to configure different LLM providers when running the Letta server with Docker. Set the appropriate environment variables when starting your container.
## Cloud providers

### OpenAI
```sh
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e OPENAI_API_KEY="your_openai_api_key" \
  letta/letta:latest
```
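If the server starts but OpenAI models are unavailable, it can help to sanity-check the key directly against the standard OpenAI REST API before troubleshooting the container (this is a plain OpenAI API call, not Letta-specific):

```sh
# Lists models the key can access; a 401 response means the key is invalid
curl -s https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```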
### Anthropic

```sh
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e ANTHROPIC_API_KEY="your_anthropic_api_key" \
  letta/letta:latest
```
### Google AI (Gemini)

```sh
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e GEMINI_API_KEY="your_gemini_api_key" \
  letta/letta:latest
```
### OpenRouter

```sh
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e OPENROUTER_API_KEY="your_openrouter_api_key" \
  letta/letta:latest
```
### Together AI

```sh
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e TOGETHER_API_KEY="your_together_api_key" \
  letta/letta:latest
```
## Enterprise cloud
### Azure OpenAI

```sh
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e AZURE_API_KEY="your_azure_api_key" \
  -e AZURE_BASE_URL="your_azure_base_url" \
  -e AZURE_API_VERSION="2024-09-01-preview" \
  letta/letta:latest
```
### AWS Bedrock

```sh
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e AWS_ACCESS_KEY_ID="your_aws_access_key_id" \
  -e AWS_SECRET_ACCESS_KEY="your_aws_secret_access_key" \
  -e AWS_DEFAULT_REGION="your_aws_default_region" \
  letta/letta:latest
```
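If Bedrock models don't show up, it is worth confirming the credentials and region work outside the container first. These are standard AWS CLI commands, not Letta-specific:

```sh
# Confirms the access key pair resolves to a valid identity
aws sts get-caller-identity

# Lists the foundation models available in the region
# (requires Bedrock permissions on the IAM identity)
aws bedrock list-foundation-models --region "$AWS_DEFAULT_REGION"
```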
### Google Vertex AI

First authenticate with Google Cloud:
```sh
gcloud auth application-default login
```

Then run the container:
```sh
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e GOOGLE_CLOUD_PROJECT="your-project-id" \
  -e GOOGLE_CLOUD_LOCATION="us-central1" \
  letta/letta:latest
```
## Local models

### Ollama
macOS/Windows (use `host.docker.internal`):
```sh
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e OLLAMA_BASE_URL="http://host.docker.internal:11434/v1" \
  letta/letta:latest
```

Linux (use `--network host`):
```sh
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  --network host \
  -e OLLAMA_BASE_URL="http://localhost:11434/v1" \
  letta/letta:latest
```
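Before starting the container, you can confirm that Ollama's OpenAI-compatible endpoint is reachable from the Docker host (on macOS/Windows the container reaches the same server via `host.docker.internal`):

```sh
# Should return a JSON list of locally pulled models
curl -s http://localhost:11434/v1/models
```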
### LM Studio

macOS/Windows (use `host.docker.internal`):
```sh
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e LMSTUDIO_BASE_URL="http://host.docker.internal:1234" \
  letta/letta:latest
```

Linux (use `--network host`):
```sh
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  --network host \
  -e LMSTUDIO_BASE_URL="http://localhost:1234" \
  letta/letta:latest
```
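LM Studio's local server also speaks the OpenAI API, so the same kind of reachability check applies (make sure the server is started in the LM Studio app first):

```sh
# Should return a JSON list of the models loaded in LM Studio
curl -s http://localhost:1234/v1/models
```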
## Multiple providers

You can enable multiple providers by setting multiple environment variables:
```sh
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e OPENAI_API_KEY="your_openai_api_key" \
  -e ANTHROPIC_API_KEY="your_anthropic_api_key" \
  -e OLLAMA_BASE_URL="http://host.docker.internal:11434/v1" \
  letta/letta:latest
```
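As the list of providers grows, an env file keeps the command manageable. This uses plain Docker behavior (`--env-file`), not anything Letta-specific; the filename `providers.env` is just an example:

```sh
# providers.env — one KEY=value per line, no quoting needed:
#   OPENAI_API_KEY=your_openai_api_key
#   ANTHROPIC_API_KEY=your_anthropic_api_key
#   OLLAMA_BASE_URL=http://host.docker.internal:11434/v1

docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  --env-file providers.env \
  letta/letta:latest
```

This also keeps API keys out of your shell history.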