To enable Anthropic models with Letta, set ANTHROPIC_API_KEY in your environment variables.

You can use Letta with Anthropic if you have an Anthropic account and API key. Currently, we support the following models:

  • claude-3-opus-20240229
  • claude-3-sonnet-20240229
  • claude-3-haiku-20240307

Currently, there are no supported embedding models for Anthropic. You will need to use a separate provider (e.g. OpenAI) or the Letta embeddings endpoint (letta-free) for embeddings.
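For example, you might point the client's default embedding configuration at the hosted letta-free endpoint. This is a sketch only: the `EmbeddingConfig` field values shown (endpoint URL, embedding dimension, chunk size) are assumptions based on the hosted letta-free defaults and may differ in your version, so verify them against your deployment:

```python
from letta import create_client, EmbeddingConfig

client = create_client()

# Assumed defaults for the hosted letta-free embeddings endpoint --
# double-check these values against your Letta version before relying on them.
client.set_default_embedding_config(
    EmbeddingConfig(
        embedding_model="letta-free",
        embedding_endpoint_type="hugging-face",
        embedding_endpoint="https://embeddings.memgpt.ai",
        embedding_dim=1024,
        embedding_chunk_size=300,
    )
)
```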

Enabling Anthropic models

To enable the Anthropic provider, set your key as an environment variable:

export ANTHROPIC_API_KEY="sk-ant-..." 
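If you want to verify the variable is visible to your process before starting Letta, a small check like the following works (the helper name `anthropic_enabled` is ours, not part of the Letta API):

```python
import os

def anthropic_enabled() -> bool:
    """Return True if ANTHROPIC_API_KEY is present and non-empty,
    i.e. the Anthropic provider can be enabled."""
    return bool(os.environ.get("ANTHROPIC_API_KEY"))
```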

Now, Anthropic models will be enabled when you run letta run or the letta service.

Configuring Anthropic models

When creating agents, you can specify the exact model configuration to use, such as the model name and context window size (which can be less than the model's maximum).

To set the default LLMConfig to claude-3-opus-20240229 with a context window of 200000, set the default config on your client:

from letta import create_client, LLMConfig

client = create_client()

# provide full config (llm config)
client.set_default_llm_config(
    LLMConfig(
        model="claude-3-opus-20240229",
        model_endpoint_type="anthropic",
        model_endpoint="https://api.anthropic.com/v1",
        context_window=200000,  # NOTE: can be set to <= 200000
    )
)
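Since the context window only needs to be less than or equal to the model's maximum, you can also build a config with a smaller window, for example to reduce per-request cost and latency. This is a sketch under the same assumptions as above; passing it per-agent (e.g. via a `llm_config` argument to agent creation) may vary by Letta version:

```python
from letta import LLMConfig

# Same model, but with a reduced context window (any value <= the
# model's 200000-token maximum should be accepted).
small_window_config = LLMConfig(
    model="claude-3-opus-20240229",
    model_endpoint_type="anthropic",
    model_endpoint="https://api.anthropic.com/v1",
    context_window=100000,  # reduced from the 200000 maximum
)
```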