To enable OpenAI models with Letta, set OPENAI_API_KEY in your environment variables.

You can use Letta with OpenAI if you have an OpenAI account and API key. Once you have set OPENAI_API_KEY in your environment variables, you can select which model to use and configure its context window size.
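Before starting Letta, it can help to confirm the key is actually visible to your process. A minimal sanity check (this helper is illustrative, not part of Letta):

```python
import os

def check_openai_key(env=os.environ):
    """Return the OpenAI API key from an environment mapping, or raise if missing."""
    key = env.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError("OPENAI_API_KEY is not set; export it before running Letta.")
    return key
```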

Currently, Letta supports the following OpenAI models:

  • gpt-4 (recommended for advanced reasoning)
  • gpt-4o-mini (recommended for low latency and cost)
  • gpt-4o
  • gpt-4-turbo (not recommended, should use gpt-4o-mini instead)
  • gpt-3.5-turbo (not recommended, should use gpt-4o-mini instead)

Enabling OpenAI models

To enable the OpenAI provider, set your key as an environment variable:

```sh
export OPENAI_API_KEY=...
```

Now, OpenAI models will be enabled when you run `letta run` or the `letta` service.

Configuring OpenAI models

When creating agents, you can specify the exact model configuration to use, such as the model name and context window size (which can be smaller than the model's maximum).

```python
from letta import LLMConfig, EmbeddingConfig

# llm_config specification
llm_config = LLMConfig(
    model="gpt-4o-mini",
    model_endpoint_type="openai",
    model_endpoint="https://api.openai.com/v1",
    context_window=128000
)

# embedding model specification
embedding_config = EmbeddingConfig(
    embedding_endpoint_type="openai",
    embedding_endpoint="https://api.openai.com/v1",
    embedding_model="text-embedding-ada-002",
    embedding_dim=1536,
    embedding_chunk_size=300
)
```
```python
from letta import create_client, LLMConfig, EmbeddingConfig

client = create_client()

# create an agent with a specific model
agent_state = client.create_agent(
    llm_config=LLMConfig(
        model="gpt-4o-mini",
        model_endpoint_type="openai",
        model_endpoint="https://api.openai.com/v1",
        context_window=128000
    ),
    embedding_config=EmbeddingConfig.from_default(model_name="text-embedding-ada-002")
)
```
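Since context_window may be set below the model's maximum but never above it, you may want to clamp requested values before building a config. A small sketch, with assumed per-model maxima (verify current limits against OpenAI's documentation):

```python
# Assumed maximum context windows; check OpenAI's docs for current values.
MODEL_MAX_CONTEXT = {
    "gpt-4o-mini": 128000,
    "gpt-4o": 128000,
    "gpt-4": 8192,
}

def clamp_context_window(model_name, requested):
    """Clamp a requested context window to the model's assumed maximum."""
    return min(requested, MODEL_MAX_CONTEXT[model_name])
```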

You can also configure a default LLMConfig to use for all agents created by the client.

```python
from letta import create_client
from letta import LLMConfig, EmbeddingConfig

client = create_client()

# use shorthand (llm config)
client.set_default_llm_config(
    LLMConfig.from_default(model_name="gpt-4o-mini")
)

# provide full config (llm config)
client.set_default_llm_config(
    LLMConfig(
        model="gpt-4o-mini",
        model_endpoint_type="openai",
        model_endpoint="https://api.openai.com/v1",
        context_window=128000
    )
)
```

Similarly, you can override the default embedding config by providing a new EmbeddingConfig object to the set_default_embedding_config method.

```python
from letta import EmbeddingConfig

# use shorthand (embedding config)
client.set_default_embedding_config(
    EmbeddingConfig.from_default(model_name="text-embedding-ada-002")
)

# provide full config (embedding config)
client.set_default_embedding_config(
    EmbeddingConfig(
        embedding_endpoint_type="openai",
        embedding_endpoint="https://api.openai.com/v1",
        embedding_model="text-embedding-ada-002",
        embedding_dim=1536,
        embedding_chunk_size=300
    )
)
```
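In the full config, embedding_dim must match the model's output size (1536 for text-embedding-ada-002), and embedding_chunk_size controls how large each piece of text is before it is embedded. As a rough, character-based illustration of what chunking means (Letta's actual splitting logic may differ):

```python
def chunk_text(text, chunk_size=300):
    """Split text into consecutive fixed-size chunks (simplified illustration)."""
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
```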