We recommend running the Letta service using the Docker container.

Running with Docker

1. Download the Docker container

To run the Docker container, first clone the repository and pull the container image:

git clone https://github.com/cpacker/MemGPT
docker pull lettaai/letta:latest

2. Set environment variables

Set the environment variables either in your shell or in a .env file.

.env

# To use OpenAI 
OPENAI_API_KEY=...  

# To use with Ollama 
LETTA_LLM_ENDPOINT=http://host.docker.internal:11434
LETTA_LLM_ENDPOINT_TYPE=ollama
LETTA_LLM_MODEL=dolphin2.2-mistral:7b-q6_K
LETTA_LLM_CONTEXT_WINDOW=8192
LETTA_EMBEDDING_ENDPOINT=http://host.docker.internal:11434
LETTA_EMBEDDING_ENDPOINT_TYPE=ollama
LETTA_EMBEDDING_MODEL=mxbai-embed-large
LETTA_EMBEDDING_DIM=512
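
If you would rather set the variables in your shell than in a .env file, export the same keys before starting the container (the values are the same placeholders as above). Keep in mind that Docker Compose only forwards shell variables that compose.yaml explicitly references, so the .env file is usually the simpler route.

# To use OpenAI
export OPENAI_API_KEY=...

# To use with Ollama
export LETTA_LLM_ENDPOINT=http://host.docker.internal:11434
export LETTA_LLM_ENDPOINT_TYPE=ollama
# (plus the remaining LETTA_* variables from the list above)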

3. View and modify the `compose.yaml` file

Review the `compose.yaml` file in the cloned repository and modify it if needed, then start the services:

docker compose up 
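
For orientation, a minimal Compose file for this setup looks roughly like the sketch below. This is only an illustration, not the file shipped in the repository: the service name, exposed ports, and layout are assumptions, so defer to the compose.yaml you cloned.

compose.yaml (illustrative sketch)

services:
  letta_server:
    image: lettaai/letta:latest   # image pulled in step 1
    env_file:
      - .env                      # environment variables from step 2
    ports:
      - "8283:8283"               # Letta server API
      - "8000:8000"               # ADE web interface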

Now, you can go to http://localhost:8000 to view the ADE and connect to the Letta server at http://localhost:8283.
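
Once the container is running, you can also check that the server answers over HTTP. The health route below is an assumption based on recent Letta versions; adjust the path if yours differs.

curl http://localhost:8283/v1/health/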

Configuring the Docker Container