LLM configuration
Cohere
Configuring with Cohere
To configure which LLM Letta uses, modify the [model] section of your ~/.letta/config file:
~/.letta/config
[model]
model_endpoint_type = cohere
model = command-r-plus
model_endpoint = https://api.cohere.ai/v1
context_window = 116000
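The file uses standard INI syntax, so if you want to sanity-check what will be read, you can inspect it with Python's built-in configparser (this is just a generic check, not part of the Letta API):
import configparser
from pathlib import Path

# Read the same file that Letta loads its model settings from
config = configparser.ConfigParser()
config.read(Path.home() / ".letta" / "config")

# Print the [model] section if it exists
if "model" in config:
    print(dict(config["model"]))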
You can override the default LLM config (specified in ~/.letta/config) by providing a new LLMConfig object to the client's set_default_llm_config method:
from memgpt import LLMConfig, create_client

# Instantiate a client (added here so the snippet runs standalone)
client = create_client()

# Override the default LLM config used for new agents
client.set_default_llm_config(
    LLMConfig(
        model="command-r-plus",
        model_endpoint_type="cohere",
        model_endpoint="https://api.cohere.ai/v1",
        context_window=116000,
    )
)
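Agents created through the client after this call will pick up the Cohere settings by default. As a quick check (assuming the standard MemGPT client methods, with a placeholder agent name):
# New agents inherit the default LLM config set above
agent_state = client.create_agent(name="cohere_agent")

# Confirm the agent is pointed at the Cohere endpoint
print(agent_state.llm_config)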