Anthropic

To enable Anthropic models with Letta, set ANTHROPIC_API_KEY in your environment variables.

You can use Letta with Anthropic if you have an Anthropic account and API key. Currently, Anthropic offers no supported embedding models (only LLMs), so you will need a separate provider (e.g. OpenAI) or the Letta embeddings endpoint (letta-free) for embeddings.
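For example, pairing an Anthropic LLM with the letta-free embeddings endpoint might look like the following sketch (it assumes a self-hosted server at the default address; the model handles follow Letta's provider/model naming convention and availability may vary by version):

```python
from letta_client import Letta

# Connect to a self-hosted Letta server (default address assumed)
client = Letta(base_url="http://localhost:8283")

# Pair an Anthropic LLM with the letta-free embeddings endpoint,
# since Anthropic itself does not provide embedding models
agent = client.agents.create(
    model="anthropic/claude-3-5-sonnet-20241022",
    embedding="letta/letta-free",
)
```

If you prefer OpenAI embeddings instead, swap the `embedding` handle for one such as `openai/text-embedding-3-small` and set OPENAI_API_KEY on the server.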

Enabling Anthropic models with Docker

To enable Anthropic models when running the Letta server with Docker, set your ANTHROPIC_API_KEY as an environment variable:

# replace `~/.letta/.persist/pgdata` with wherever you want to store your agent data
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e ANTHROPIC_API_KEY="your_anthropic_api_key" \
  letta/letta:latest

See the self-hosting guide for more information on running Letta with Docker.

Specifying agent models

When creating agents on your self-hosted server, you must specify both the LLM and embedding models to use. You can additionally specify a context window limit (which must be less than or equal to the maximum size).

from letta_client import Letta

# Connect to your self-hosted server
client = Letta(base_url="http://localhost:8283")

agent = client.agents.create(
    model="anthropic/claude-3-5-sonnet-20241022",
    embedding="openai/text-embedding-3-small",  # an embedding model is required for self-hosted
    # optional configuration
    context_window_limit=30000
)

Anthropic models have very large context windows, which can be expensive and slow to use in full. We recommend setting a lower context_window_limit when using Anthropic models.

For Letta Cloud usage, see the quickstart guide. Cloud deployments manage embeddings automatically and don’t require provider configuration.