# LM Studio
LM Studio support is currently experimental. If things aren't working as expected, please reach out to us on Discord!

Models marked as "native tool use" on LM Studio are more likely to work well with Letta.
## Set up LM Studio

- Download and install LM Studio, along with the model you want to test with
- Make sure to start the LM Studio server
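The server can be started from the "Developer" tab in the LM Studio app, or from the terminal with the `lms` CLI that ships with LM Studio. A quick sketch (assuming LM Studio's default port of `1234`):

```shell
# Start the LM Studio local server (listens on port 1234 by default)
lms server start

# Verify the server is up by listing the loaded models
# via its OpenAI-compatible API
curl http://localhost:1234/v1/models
```

If the `curl` command returns a JSON list of models, the server is ready for Letta to connect to.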
## Enabling LM Studio with Docker

To enable LM Studio models when running the Letta server with Docker, set the `LMSTUDIO_BASE_URL` environment variable.
**macOS/Windows:**

Since LM Studio is running on the host network, you will need to use `host.docker.internal` to connect to the LM Studio server instead of `localhost`.
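For example, a minimal `docker run` invocation might look like the following (assuming LM Studio's default port of `1234` and the `letta/letta` image; adjust ports and volumes to match your setup):

```shell
# Run the Letta server, pointing it at LM Studio on the host machine.
# host.docker.internal resolves to the host from inside the container
# on Docker Desktop (macOS/Windows).
docker run \
  -p 8283:8283 \
  -e LMSTUDIO_BASE_URL="http://host.docker.internal:1234" \
  letta/letta:latest
```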
**Linux:**

Use `--network host` and `localhost`:
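A sketch of the equivalent command on Linux (again assuming LM Studio's default port of `1234` and the `letta/letta` image):

```shell
# With --network host, the container shares the host's network stack,
# so LM Studio is reachable at localhost and no -p port mapping is needed.
docker run \
  --network host \
  -e LMSTUDIO_BASE_URL="http://localhost:1234" \
  letta/letta:latest
```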
See the self-hosting guide for more information on running Letta with Docker.
## Model support

Note: models labelled as MLX are only compatible with Apple Silicon Macs.
The following models have been tested with Letta as of 2025-07-11 on LM Studio 0.3.18:

- `qwen3-30b-a3b`
- `qwen3-14b-mlx`
- `qwen3-8b-mlx`
- `qwen2.5-32b-instruct`
- `qwen2.5-14b-instruct-1m`
- `qwen2.5-7b-instruct`
- `meta-llama-3.1-8b-instruct`
Some models recommended on LM Studio, such as `mlx-community/ministral-8b-instruct-2410` and `bartowski/ministral-8b-instruct-2410`, may not work well with Letta because their default prompt templates are incompatible. Adjusting the templates can restore compatibility, but may degrade model performance.