Anthropic
You can use Letta with Anthropic if you have an Anthropic account and API key, with `ANTHROPIC_API_KEY` set in your environment variables. Currently, we support the following models:
- `claude-3-opus-20240229`
- `claude-3-sonnet-20240229`
- `claude-3-haiku-20240307`
Currently, there are no supported embedding models for Anthropic. You will need to use a separate provider (e.g. OpenAI) or the Letta embeddings endpoint (`letta-free`) for embeddings.
Enabling Anthropic models
To enable the Anthropic provider, set your key as an environment variable:
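For example, in a Unix shell (replace the placeholder with your actual key):

```shell
export ANTHROPIC_API_KEY="your_anthropic_api_key"
```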
Now, Anthropic models will be enabled when you run `letta run` or the `letta` service.
Using the `docker run` server with Anthropic
To enable Anthropic models, simply set your `ANTHROPIC_API_KEY` as an environment variable:
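A sketch of the command, assuming the `letta/letta` image and the default server port `8283` (adjust the image tag and port to match your setup):

```shell
# Pass the key into the container and expose the Letta server port
docker run \
  -e ANTHROPIC_API_KEY="your_anthropic_api_key" \
  -p 8283:8283 \
  letta/letta:latest
```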
CLI (pypi only)
Using `letta run` and `letta server` with Anthropic
To chat with an agent, run:
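With the key exported in your shell, start an interactive agent session:

```shell
letta run
```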
This will prompt you to select an Anthropic model.
To run the Letta server, run:
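Again with the key exported in your shell:

```shell
letta server
```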
To select the model used by the server, use the dropdown in the ADE or specify an `LLMConfig` object in the Python SDK.
Configuring Anthropic models
When creating agents, you can specify the exact model configuration to use, such as the model name and context window size (which can be less than the model's maximum).
To set a default `LLMConfig` of `claude-3-opus-20240229` with a context window of 200000, you can set the default config for your client:
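A minimal sketch using the Python SDK. The import paths and the `set_default_llm_config` helper are assumptions and may differ across Letta versions; check your installed version's reference for the exact names:

```python
from letta import create_client
from letta.schemas.llm_config import LLMConfig  # import path may vary by version

client = create_client()

# Assumed helper: sets the default model config used for newly created agents.
client.set_default_llm_config(
    LLMConfig(
        model="claude-3-opus-20240229",
        model_endpoint_type="anthropic",
        model_endpoint="https://api.anthropic.com/v1",
        context_window=200000,  # can be set lower than the model's maximum
    )
)
```

Agents created through this client afterwards will default to the Opus model unless they override the config explicitly.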