Anthropic
You can use Letta with Anthropic if you have an Anthropic account and API key. To enable Anthropic models, set ANTHROPIC_API_KEY in your environment variables. Currently, we support the following models:
claude-3-opus-20240229
claude-3-sonnet-20240229
claude-3-haiku-20240307
Currently, there are no supported embedding models for Anthropic. You will need to use a separate provider (e.g. OpenAI) or the Letta embeddings endpoint (letta-free) for embeddings.
Enabling Anthropic models
To enable the Anthropic provider, set your key as an environment variable:
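For example (replace the placeholder with your own key):

```sh
export ANTHROPIC_API_KEY="sk-ant-..."
```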
Now, Anthropic models will be enabled when you run letta run or the letta server.
Using the docker run server with Anthropic
To enable Anthropic models, simply set your ANTHROPIC_API_KEY as an environment variable:
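A minimal sketch of the docker run invocation, assuming the standard letta/letta image, the default server port (8283), and a local volume for Postgres persistence; adjust the mounts and port mapping to match your setup:

```sh
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e ANTHROPIC_API_KEY="your_anthropic_api_key" \
  letta/letta:latest
```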
Using letta run and letta server with Anthropic
To chat with an agent, run:
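With the key exported as shown above:

```sh
letta run
```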
This will prompt you to select an Anthropic model.
To run the Letta server, run:
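For example:

```sh
letta server
```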
To select the model used by the server, use the dropdown in the ADE or specify an LLMConfig object in the Python SDK (see the example in the next section).
Configuring Anthropic models
When creating agents, you must specify both the LLM and embedding models to use. You can additionally specify a context window limit (which must be less than or equal to the model's maximum context size). Note that Anthropic does not provide embedding models, so you will need to use another provider for embeddings.
Anthropic models have very large context windows, which can be expensive and slow to process. We recommend setting a lower context_window_limit when using Anthropic models.
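A minimal sketch using the Python SDK, assuming the LLMConfig/EmbeddingConfig field names and the letta-free endpoint details shown below; check your installed Letta version for the exact schema:

```python
from letta import create_client, LLMConfig, EmbeddingConfig

client = create_client()

agent = client.create_agent(
    name="my_anthropic_agent",  # hypothetical name for illustration
    llm_config=LLMConfig(
        model="claude-3-opus-20240229",
        model_endpoint_type="anthropic",
        model_endpoint="https://api.anthropic.com/v1",
        # Cap the context window below the model's maximum to reduce
        # cost and latency.
        context_window=16000,
    ),
    # Anthropic has no embedding models, so this sketch points at the
    # free Letta embeddings endpoint (letta-free); a separate provider
    # (e.g. OpenAI) works too.
    embedding_config=EmbeddingConfig(
        embedding_endpoint_type="hugging-face",
        embedding_endpoint="https://embeddings.memgpt.ai",
        embedding_model="letta-free",
        embedding_dim=1024,
        embedding_chunk_size=300,
    ),
)
```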