OpenAI
Make sure you set OPENAI_API_KEY in your environment variables.

You can use Letta with OpenAI if you have an OpenAI account and API key. Once you have set your OPENAI_API_KEY in your environment variables, you can select which model to use and configure the context window size.
Currently, Letta supports the following OpenAI models:
- gpt-4 (recommended for advanced reasoning)
- gpt-4o-mini (recommended for low latency and cost)
- gpt-4o
- gpt-4-turbo (not recommended, use gpt-4o-mini instead)
- gpt-3.5-turbo (not recommended, use gpt-4o-mini instead)
Enabling OpenAI models
To enable the OpenAI provider, set your key as an environment variable:
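```sh
# Example only: replace the placeholder value with your actual OpenAI API key
export OPENAI_API_KEY="sk-..."
```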
Now, OpenAI models will be enabled when you run letta run or the letta service.
Configuring OpenAI models
When creating agents, you can specify the exact model configuration to use, such as the model name and context window size (which can be less than the model's maximum).
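For example, a per-agent configuration might look like the following sketch. It assumes the Letta Python client's create_client()/create_agent() interface and these LLMConfig field names, which can vary between Letta versions:

```python
# Sketch only: method and field names are assumptions based on the Letta Python
# client; check your installed version for the exact interface.
from letta import create_client, LLMConfig

client = create_client()

agent = client.create_agent(
    name="openai_agent",
    llm_config=LLMConfig(
        model="gpt-4o-mini",                         # OpenAI model name
        model_endpoint_type="openai",
        model_endpoint="https://api.openai.com/v1",
        context_window=16000,                        # can be less than the model's maximum
    ),
)
```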
You can also configure a default LLMConfig to use for all agents created by the client.
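For instance, a client-wide default could be set once up front. This sketch assumes a set_default_llm_config method, named by analogy with the set_default_embedding_config method mentioned below; verify the method name in your Letta version:

```python
# Assumption: set_default_llm_config is named by analogy with
# set_default_embedding_config; confirm against your Letta version.
from letta import create_client, LLMConfig

client = create_client()
client.set_default_llm_config(
    LLMConfig(
        model="gpt-4o",
        model_endpoint_type="openai",
        model_endpoint="https://api.openai.com/v1",
        context_window=32000,
    )
)
# Agents created afterwards use this config unless an llm_config is passed explicitly.
```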
Similarly, you can override the default embedding config by providing a new EmbeddingConfig object to the set_default_embedding_config method.
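As a sketch, assuming the EmbeddingConfig field names shown below and OpenAI's text-embedding-ada-002 model (adjust both to your Letta version and chosen embedding model):

```python
# Sketch: field names and values are assumptions; check your Letta version's
# EmbeddingConfig schema and OpenAI's embedding model documentation.
from letta import create_client, EmbeddingConfig

client = create_client()
client.set_default_embedding_config(
    EmbeddingConfig(
        embedding_model="text-embedding-ada-002",
        embedding_endpoint_type="openai",
        embedding_endpoint="https://api.openai.com/v1",
        embedding_dim=1536,        # dimensionality of text-embedding-ada-002
        embedding_chunk_size=300,  # tokens per chunk when embedding documents
    )
)
```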