LM Studio

LM Studio support is currently experimental. If things aren’t working as expected, please reach out to us on Discord!

Models marked as “native tool use” on LM Studio are more likely to work well with Letta.

Set up LM Studio

  1. Download + install LM Studio and the model you want to test with
  2. Make sure to start the LM Studio server

Enabling LM Studio with Docker

To enable LM Studio models when running the Letta server with Docker, set the LMSTUDIO_BASE_URL environment variable.

macOS/Windows: Since LM Studio is running on the host network, you will need to use host.docker.internal to connect to the LM Studio server instead of localhost.

$ # replace `~/.letta/.persist/pgdata` with wherever you want to store your agent data
$ docker run \
>   -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
>   -p 8283:8283 \
>   -e LMSTUDIO_BASE_URL="http://host.docker.internal:1234" \
>   letta/letta:latest

Linux: Use --network host and localhost:

$ docker run \
>   -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
>   --network host \
>   -e LMSTUDIO_BASE_URL="http://localhost:1234" \
>   letta/letta:latest

See the self-hosting guide for more information on running Letta with Docker.

Model support

Note: models labelled as MLX are only compatible with Apple Silicon Macs.

The following models have been tested with Letta as of 7-11-2025 on LM Studio 0.3.18.

  • qwen3-30b-a3b
  • qwen3-14b-mlx
  • qwen3-8b-mlx
  • qwen2.5-32b-instruct
  • qwen2.5-14b-instruct-1m
  • qwen2.5-7b-instruct
  • meta-llama-3.1-8b-instruct

Some models recommended on LM Studio, such as mlx-community/ministral-8b-instruct-2410 and bartowski/ministral-8b-instruct-2410, may not work well with Letta because their default prompt templates are incompatible. Adjusting the templates can restore compatibility, but may degrade model performance.