Google Vertex AI
To enable Vertex AI models with Letta, set the GOOGLE_CLOUD_PROJECT and GOOGLE_CLOUD_LOCATION environment variables to your GCP project ID and region.
Enabling Google Vertex AI as a provider
To start, make sure you are authenticated with Google Vertex AI:
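For example, if you use the gcloud CLI, you can authenticate with Application Default Credentials (this is one common approach; adjust to your own auth setup):

```sh
# Authenticate with Application Default Credentials via the gcloud CLI
gcloud auth application-default login
```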
To enable the Google Vertex AI provider, you must set the GOOGLE_CLOUD_PROJECT and GOOGLE_CLOUD_LOCATION environment variables. You can get these values from the Vertex console.
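For example, in a shell the two variables can be exported directly (the project ID and region below are placeholders; substitute your own values):

```sh
# Replace with your own GCP project ID and region
export GOOGLE_CLOUD_PROJECT="my-gcp-project"
export GOOGLE_CLOUD_LOCATION="us-central1"
```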
Using the docker run server with Google Vertex AI
To enable Google Vertex AI models, simply set your GOOGLE_CLOUD_PROJECT and GOOGLE_CLOUD_LOCATION as environment variables:
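A minimal sketch of the docker run invocation, assuming the letta/letta image and its default port 8283; the volume mount path for persistence is illustrative and your setup may differ:

```sh
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e GOOGLE_CLOUD_PROJECT="my-gcp-project" \
  -e GOOGLE_CLOUD_LOCATION="us-central1" \
  letta/letta:latest
```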
CLI (pypi only)
Using letta run and letta server with Google Vertex AI
Make sure you install the required dependencies with:
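A minimal sketch of the install step; depending on your Letta version, a provider-specific extra may also be required, so check the Letta installation docs:

```sh
# Install (or upgrade) the Letta CLI from PyPI
pip install -U letta
```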
To chat with an agent, run:
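With the environment variables from above set in your shell (placeholder values shown), start a chat session with the Letta CLI:

```sh
export GOOGLE_CLOUD_PROJECT="my-gcp-project"
export GOOGLE_CLOUD_LOCATION="us-central1"
letta run
```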
To run the Letta server, run:
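Similarly, the server can be started with the same variables exported (values are placeholders):

```sh
export GOOGLE_CLOUD_PROJECT="my-gcp-project"
export GOOGLE_CLOUD_LOCATION="us-central1"
letta server
```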
To select the model used by the server, use the dropdown in the ADE or specify an LLMConfig object in the Python SDK.