DeepSeek
```shell
DEEPSEEK_API_KEY=...
```
You can use Letta with DeepSeek if you have a DeepSeek account and API key. Once you have set DEEPSEEK_API_KEY
in your environment variables, you can select which model to use and configure the context window size.
Please note that R1 does not natively support function calling in the DeepSeek API, and V3 function calling is unstable, which may result in unreliable tool calling inside Letta agents.
The DeepSeek API for R1 is often down. Make sure you can connect to the DeepSeek API directly by running:
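For example, a minimal connectivity check against DeepSeek's OpenAI-compatible chat completions endpoint (`deepseek-reasoner` is the R1 model; use `deepseek-chat` for V3):

```shell
# Quick check that the DeepSeek API is reachable with your key
curl https://api.deepseek.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $DEEPSEEK_API_KEY" \
  -d '{
        "model": "deepseek-reasoner",
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```

A successful response returns a JSON chat completion; an error or timeout here means the problem is with the DeepSeek API itself, not with Letta.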
Enabling DeepSeek as a provider
To enable the DeepSeek provider, you must set the DEEPSEEK_API_KEY
environment variable. When this is set, Letta will make the LLM models available on DeepSeek usable in your agents.
Using the docker run server with DeepSeek
To enable DeepSeek models, simply set your DEEPSEEK_API_KEY
as an environment variable:
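For example, a sketch assuming the standard `letta/letta` Docker image, the default server port, and the usual persistence volume (adjust paths and tags for your setup):

```shell
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e DEEPSEEK_API_KEY="your_deepseek_api_key" \
  letta/letta:latest
```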
CLI (pypi only)
Using letta run and letta server with DeepSeek
To chat with an agent, run:
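With the key exported in your shell first:

```shell
export DEEPSEEK_API_KEY="your_deepseek_api_key"
letta run
```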
To run the Letta server, run:
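Again with the key exported in your shell:

```shell
export DEEPSEEK_API_KEY="your_deepseek_api_key"
letta server
```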
To select the model used by the server, use the dropdown in the ADE or specify an LLMConfig
object in the Python SDK.
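For example, a sketch of selecting a DeepSeek model through the Python SDK. The field names follow Letta's LLMConfig, but treat the exact values (model names, endpoint, context window) as assumptions to verify against your Letta version:

```python
from letta import LLMConfig, create_client

client = create_client()

# Hypothetical configuration for DeepSeek V3 via the OpenAI-compatible
# DeepSeek endpoint; adjust the model name and context window as needed.
agent = client.create_agent(
    name="deepseek-agent",
    llm_config=LLMConfig(
        model="deepseek-chat",            # V3; "deepseek-reasoner" selects R1
        model_endpoint_type="deepseek",
        model_endpoint="https://api.deepseek.com/v1",
        context_window=64000,
    ),
)
```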