Google AI (Gemini)
To use Letta with Google AI, you need a Google API account and API key. Set `GEMINI_API_KEY` in your environment variables; once it is set, you can select a model and configure the context window size.
Enabling Google AI as a provider
To enable the Google AI provider, you must set the `GEMINI_API_KEY` environment variable. When it is set, Letta can use the LLM models available on Google AI.
Using the `docker run` server with Google AI
To enable Google Gemini models, simply set your `GEMINI_API_KEY` as an environment variable:
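A minimal invocation might look like the following sketch; the image name, port, and volume path assume the standard Letta Docker setup, and the key value is a placeholder:

```shell
# Pass your Gemini API key into the container as an environment variable.
# The volume mount persists the server's database between runs.
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e GEMINI_API_KEY="your_gemini_api_key" \
  letta/letta:latest
```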
CLI (PyPI only)
Using `letta run` and `letta server` with Google AI
To chat with an agent, run:
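For example (with a placeholder key):

```shell
# Export the key so the CLI can reach Google AI, then start a chat.
export GEMINI_API_KEY="your_gemini_api_key"
letta run
```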
This will prompt you to select a model, as well as an embedding model.
To run the Letta server, run:
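For example (the key must also be set in the server's environment):

```shell
export GEMINI_API_KEY="your_gemini_api_key"
letta server
```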
To select the model used by the server, use the dropdown in the ADE or specify an `LLMConfig` object in the Python SDK.
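As an illustrative sketch only (the import paths, field names, and model identifier below are assumptions that vary across Letta versions; verify against your installed SDK), configuring a Gemini model via `LLMConfig` might look like:

```python
# Sketch: exact import paths and constructor fields vary by Letta version.
from letta import create_client, LLMConfig

client = create_client()  # assumes a local Letta server is running

# model_endpoint_type "google_ai" routes requests to Google AI;
# context_window can be set lower than the model's maximum if desired.
agent = client.create_agent(
    llm_config=LLMConfig(
        model="gemini-1.5-pro",
        model_endpoint_type="google_ai",
        model_endpoint="https://generativelanguage.googleapis.com",
        context_window=1048576,
    )
)
```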