The fastest way to get started with the Anthropic backend is to use the quickstart command.
```sh
letta quickstart --backend anthropic
```
For more detailed configuration, see the instructions below.
CLI (pypi only)
To configure your LLM, modify the `[model]` section of your `~/.letta/config` file:

```
[model]
model_endpoint_type = anthropic
model = claude-3-opus-20240229
model_endpoint = https://api.anthropic.com/v1
context_window = 200000
```
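If you want to confirm what Letta will read from this file, the section can be inspected with Python's standard `configparser`. This is a minimal, self-contained sketch (the config text is inlined here instead of being read from `~/.letta/config`):

```python
import configparser

# The [model] section as it would appear in ~/.letta/config
CONFIG_TEXT = """
[model]
model_endpoint_type = anthropic
model = claude-3-opus-20240229
model_endpoint = https://api.anthropic.com/v1
context_window = 200000
"""

parser = configparser.ConfigParser()
parser.read_string(CONFIG_TEXT)

model = parser["model"]
print(model["model"])                 # claude-3-opus-20240229
print(model.getint("context_window"))  # 200000
```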
Python SDK
You can override the default LLM config (specified in ~/.letta/config) by providing a new LLMConfig object to the set_default_llm_config method.
```python
from memgpt import LLMConfig

# Provide a full LLM config
client.set_default_llm_config(
    LLMConfig(
        model="claude-3-opus-20240229",
        model_endpoint_type="anthropic",
        model_endpoint="https://api.anthropic.com/v1",
        context_window=200000,
    )
)
```
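Before handing these values to `LLMConfig`, it can be useful to sanity-check them. The following is an illustrative sketch using plain Python; the dict and `validate_llm_settings` helper are hypothetical and not part of the SDK:

```python
# Illustrative sanity checks on the Anthropic LLM settings (not part of the SDK)
llm_settings = {
    "model": "claude-3-opus-20240229",
    "model_endpoint_type": "anthropic",
    "model_endpoint": "https://api.anthropic.com/v1",
    "context_window": 200000,
}

def validate_llm_settings(settings: dict) -> bool:
    """Basic checks: HTTPS endpoint, anthropic backend, positive integer context window."""
    return (
        settings["model_endpoint"].startswith("https://")
        and settings["model_endpoint_type"] == "anthropic"
        and isinstance(settings["context_window"], int)
        and settings["context_window"] > 0
    )

print(validate_llm_settings(llm_settings))  # True
```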