Groq
To use Letta with Groq, set the environment variable GROQ_API_KEY=...
You can use Letta with Groq if you have a Groq account and API key. Once you have set GROQ_API_KEY in your environment variables, you can select which Groq model to use and configure its context window size.
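For example, in your shell (the value below is a placeholder for your actual key):

```sh
export GROQ_API_KEY="your_groq_api_key"
```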
Enabling Groq as a provider
To enable the Groq provider, you must set the GROQ_API_KEY environment variable. When this is set, Letta will make the LLM models running on Groq available for use.
Using the docker run server with Groq
To enable Groq models, simply set your GROQ_API_KEY as an environment variable:
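A minimal sketch of the command, assuming the letta/letta Docker image, the default 8283 port, and a local directory for persisted Postgres data (adjust the volume path and image tag for your setup):

```sh
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e GROQ_API_KEY="your_groq_api_key" \
  letta/letta:latest
```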
Using letta run and letta server with Groq (CLI installed via pypi only)
To chat with an agent, run:
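For example, assuming the letta CLI is installed from pypi and using a placeholder for the key:

```sh
export GROQ_API_KEY="your_groq_api_key"
letta run
```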
This will prompt you to select one of the available Groq models.
To run the Letta server, run:
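For example, again with the key exported in the same shell:

```sh
export GROQ_API_KEY="your_groq_api_key"
letta server
```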
To select the model used by the server, use the model dropdown in the ADE or specify an LLMConfig object in the Python SDK.
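As a rough sketch of the SDK route, assuming the letta Python package's create_client and LLMConfig interface (exact imports, field names, and available Groq model IDs vary by version, so treat the values below as placeholders and check your installed SDK):

```python
from letta import create_client, LLMConfig

# Connect to a locally running Letta server (default: http://localhost:8283)
client = create_client()

# Create an agent backed by a Groq-hosted model; the model name and
# context window below are illustrative placeholders.
agent = client.create_agent(
    name="groq_agent",
    llm_config=LLMConfig(
        model="llama-3.1-70b-versatile",
        model_endpoint_type="groq",
        model_endpoint="https://api.groq.com/openai/v1",
        context_window=8192,
    ),
)
```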