To enable Anthropic models with Letta, set ANTHROPIC_API_KEY in your environment variables.

You can use Letta with Anthropic if you have an Anthropic account and API key. Currently, we support the following models:

  • claude-3-opus-20240229
  • claude-3-sonnet-20240229
  • claude-3-haiku-20240307

Currently, there are no supported embedding models for Anthropic. You will need to use a separate provider (e.g. OpenAI) or the Letta embeddings endpoint (letta-free) for embeddings.

Enabling Anthropic models

To enable the Anthropic provider, set your key as an environment variable:

$ export ANTHROPIC_API_KEY="sk-ant-..."
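If you want to sanity-check that the key is actually visible to the process before starting Letta, a plain-Python helper (not part of Letta; the `sk-ant-` prefix check is a heuristic based on the key format shown above) might look like:

```python
import os

def anthropic_key_configured() -> bool:
    """Heuristic check: ANTHROPIC_API_KEY is set and has the 'sk-ant-' prefix."""
    key = os.environ.get("ANTHROPIC_API_KEY", "")
    return key.startswith("sk-ant-")
```

Running this in the same shell session where you exported the variable confirms Letta will see it too, since child processes inherit the environment.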

Now Anthropic models will be enabled when you run letta run or the letta server.

Using the docker run server with Anthropic

To enable Anthropic models, simply set your ANTHROPIC_API_KEY as an environment variable:

$ # replace `~/.letta/.persist/pgdata` with wherever you want to store your agent data
$ docker run \
    -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
    -p 8283:8283 \
    -e ANTHROPIC_API_KEY="your_anthropic_api_key" \
    letta/letta:latest

Using letta run and letta server with Anthropic

To chat with an agent, run:

$ export ANTHROPIC_API_KEY="sk-ant-..."
$ letta run

This will prompt you to select an Anthropic model.

? Select LLM model: (Use arrow keys)
» letta-free [type=openai] [ip=https://inference.memgpt.ai]
  claude-3-opus-20240229 [type=anthropic] [ip=https://api.anthropic.com/v1]
  claude-3-sonnet-20240229 [type=anthropic] [ip=https://api.anthropic.com/v1]
  claude-3-haiku-20240307 [type=anthropic] [ip=https://api.anthropic.com/v1]

To run the Letta server, run:

$ export ANTHROPIC_API_KEY="sk-ant-..."
$ letta server

To select the model used by the server, use the dropdown in the ADE or specify an LLMConfig object in the Python SDK.

Configuring Anthropic models

When creating agents, you must specify the LLM and embedding models to use. You can additionally specify a context window limit (which must be less than or equal to the maximum size). Note that Anthropic does not have embedding models, so you will need to use another provider.

from letta_client import Letta

client = Letta(base_url="http://localhost:8283")

anthropic_agent = client.agents.create(
    model="anthropic/claude-3-5-sonnet-20241022",
    embedding="openai/text-embedding-3-small",
    # optional configuration
    context_window_limit=30000,
)

Anthropic models have very large context windows, and using them in full can be expensive and high latency. We recommend setting a lower context_window_limit when using Anthropic models.
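As a rough illustration of the limit rule above (the 200,000-token maximum is an assumption based on Claude 3's published context window, not a value read from Letta), validating a requested context_window_limit might look like:

```python
# Illustrative sketch: check a requested context window against a model's
# maximum. The 200,000-token figures are assumptions for Claude 3 models,
# not values taken from Letta itself.
MAX_CONTEXT_WINDOW = {
    "claude-3-opus-20240229": 200_000,
    "claude-3-sonnet-20240229": 200_000,
    "claude-3-haiku-20240307": 200_000,
}

def validate_context_window_limit(model: str, limit: int) -> int:
    """Return `limit` if it fits within the model's maximum, else raise."""
    maximum = MAX_CONTEXT_WINDOW.get(model)
    if maximum is None:
        raise KeyError(f"unknown Anthropic model: {model}")
    if limit > maximum:
        raise ValueError(
            f"context_window_limit {limit} exceeds maximum {maximum}"
        )
    return limit
```

A limit like the 30000 used in the example above passes this check comfortably while keeping per-request cost and latency down.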
