
Remote environments

Access your agent remotely from any device

Remote Environments separate where you interact with a Letta agent from where it executes. Using remote environments, you can interact from the Letta Code desktop app or chat.letta.com with agents that run locally on registered machines, meaning you can message an agent working on your laptop from your phone.

The same agent can even move between remote environments within a conversation, allowing agents to work across machines just like a human developer. When an agent moves between execution environments, all of its memory (e.g. conversation history, context repositories, etc.) comes with it.

You can start a conversation in a remote sandbox and continue it on your local machine, and the agent will still have the same persistent memory and conversation history. You can read more about how Letta agents' memory works in our latest blog post on context repositories.

Most agents today are defined by their environment: memories, skills, files, and other context are stored locally and must be explicitly transported. In contrast, human developers move between environments without losing context.

In Letta Code, all sessions are tied to a persistent agent that has an identity and associated memory. With remote environments in Letta Code, a single agent can work across:

  • Your laptop
  • An ephemeral sandbox
  • A remote VM (e.g. Railway, GCP)
To enable remote access:

  1. Install the Letta Code desktop app.

  2. Navigate to the app settings (top-left on macOS), then enable “Allow remote access”.

  3. You can now deploy agents on the machine where you installed the Letta Code app, and access them from any device!

    For example, if you enable remote access in the Letta Code app installed on your home PC, you can chat with agents running on that PC from chat.letta.com anywhere, as long as the app is running.

Remote Environments carry the full human-in-the-loop approval flow over WebSocket. When the agent invokes a tool that requires approval, the approval request is surfaced in either the Letta Code app or chat.letta.com. The user can approve, deny, or edit the tool arguments before execution proceeds.

You can run letta server on a cloud VM or container so your agent is always-on. Since letta server only makes an outbound WebSocket connection to Letta Cloud, there are no inbound ports to open, no reverse proxy to configure, and no domain name needed.

There are two ways to authenticate a remote letta server:

Option A: OAuth device flow (recommended). If no API key is set, letta server starts an OAuth login flow and prints an authorization URL to stdout. This is the only authentication method on Pro, Max-lite, and Max plans.

Option B: API key (Developer plans only). Set LETTA_API_KEY as an environment variable. API keys are included on Developer plans; on other plans, API key usage automatically spends credits.

With OAuth, the server prints an authorization URL on startup:

No API key found. Starting OAuth login...
To authenticate, visit: https://app.letta.com/oauth/device?user_code=ABCD-EFGH
Your code: ABCD-EFGH
Waiting for authorization...

Open the URL in your browser, approve the request, and the server authenticates and connects. Credentials are persisted to disk, so subsequent restarts authenticate automatically. Access tokens are refreshed on startup if expired.
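The wait-for-approval step is a standard OAuth 2.0 device-flow polling loop (RFC 8628). The sketch below shows the shape of that loop with `fetch_token` standing in for the real HTTP POST to the token endpoint; the response fields follow the RFC, not Letta's actual API.

```python
import time

def poll_for_token(fetch_token, interval=1.0, timeout=300.0, sleep=time.sleep):
    """Poll the token endpoint until the user approves in the browser (RFC 8628)."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        resp = fetch_token()
        if "access_token" in resp:
            return resp["access_token"]      # user approved; done
        if resp.get("error") == "authorization_pending":
            sleep(interval)                  # not approved yet; keep waiting
        elif resp.get("error") == "slow_down":
            interval += 5                    # server asked us to back off
            sleep(interval)
        else:
            raise RuntimeError(resp.get("error", "unknown error"))
    raise TimeoutError("user did not authorize in time")

# Simulated token endpoint: pending twice, then approved.
responses = iter([{"error": "authorization_pending"},
                  {"error": "authorization_pending"},
                  {"access_token": "tok-123"}])
token = poll_for_token(lambda: next(responses), sleep=lambda s: None)
print(token)  # tok-123
```

Once the loop returns, the server stores the token (and a refresh token) to disk, which is why subsequent restarts skip the browser step.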

Device flow is available only when letta server is talking to Letta Cloud (https://api.letta.com). Remote environments are a Letta Cloud feature. Self-hosted Letta servers do not currently support the remote environments API.

If you use OAuth on a remote machine or container, make sure ~/.letta/ lives on persistent storage. That is where Letta stores its fallback auth state when no system keychain is available.

# Create a droplet
doctl compute droplet create letta-remote \
--size s-1vcpu-512mb-10gb \
--image ubuntu-24-04-x64 \
--region sfo3 \
--ssh-keys $(doctl compute ssh-key list --format ID --no-header | head -1)

Or create one from the DigitalOcean dashboard — pick Ubuntu 24.04, the $4/mo plan, and your SSH key.

SSH in and install:

ssh root@<droplet-ip>
# Install Node.js 20 + build tools (needed for native modules)
curl -fsSL https://deb.nodesource.com/setup_20.x | bash -
apt-get install -y nodejs python3 make g++
# Install Letta Code
npm install -g @letta-ai/letta-code
# Start the server
letta server --env-name "cloud"

To keep it running across reboots, create a systemd service:

cat > /etc/systemd/system/letta-server.service << 'EOF'
[Unit]
Description=Letta Code Remote Server
After=network-online.target
Wants=network-online.target
[Service]
Type=simple
ExecStart=/usr/bin/letta server --env-name "cloud"
Restart=always
RestartSec=5
[Install]
WantedBy=multi-user.target
EOF
systemctl daemon-reload
systemctl enable --now letta-server

Before enabling the service, run letta server --env-name "cloud" once manually over SSH and complete the OAuth device flow in your browser. The saved auth state under ~/.letta/ is reused on restart. If the service ever needs re-authentication, check journalctl -u letta-server -f for the device code URL. Alternatively, add Environment=LETTA_API_KEY=your-key to the service file (Developer plans only).

Clone the deployment repo:

git clone https://github.com/letta-ai/letta-code-server-deployment.git
cd letta-code-server-deployment

Launch and deploy:

fly launch --name letta-remote --no-deploy
# Create a persistent volume for auth and state
fly volumes create letta_data --region sjc --size 1
fly deploy

The included fly.toml configures the volume mount and sets ENV_NAME to "fly". After deploy, check the logs:

fly logs --app letta-remote

Check the logs for the OAuth authorization URL, visit it in your browser, and approve the request. The mounted volume keeps /root/.letta/ around, so auth survives machine restarts. On Developer plans, you can skip OAuth by setting fly secrets set LETTA_API_KEY="your-key" before deploying.

One-click deploy:

Deploy on Railway

Or manually:

  1. Fork the letta-code-server-deployment repo (or push your own Dockerfile)
  2. Connect the repo in Railway
  3. Add a persistent volume mounted at /root (to preserve auth and state across deploys)
  4. Deploy
  5. Open the deploy logs, find the OAuth URL, and approve it in your browser

Or use the Railway CLI:

railway init
railway up
railway logs

After the first deploy, check the logs for the OAuth authorization URL. The volume at /root preserves auth state across restarts. On Developer plans, you can skip OAuth by setting railway variables set LETTA_API_KEY="your-key" before deploying.

|             | DigitalOcean             | Fly.io                  | Railway                   |
|-------------|--------------------------|-------------------------|---------------------------|
| Setup       | SSH + 3 commands         | Dockerfile + CLI        | Git push or CLI           |
| Cost        | $4/mo flat               | ~$3/mo usage            | ~$5/mo usage              |
| Persistence | systemd                  | Built-in                | Built-in                  |
| Best for    | Simplicity, full control | Infra-as-code workflows | Quick deploys from GitHub |

For most users, DigitalOcean is the fastest path: a $4 VM where you SSH in and run three commands. All platforms include ready-to-use configs in the letta-code-server-deployment repo.

| Setting             | Location                                 | Description                                                                    |
|---------------------|------------------------------------------|--------------------------------------------------------------------------------|
| deviceId            | ~/.letta/settings.json                   | Stable UUID, generated once                                                    |
| listenerEnvName     | .letta/settings.local.json (per-project) | Saved environment name                                                         |
| Remote server state | ~/.letta/remote-settings.json            | Per-conversation working directories and permission modes restored after restarts |
| Path                                 | Description               |
|--------------------------------------|---------------------------|
| ~/.letta/logs/remote/{timestamp}.log | Per-session transport log |
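For debugging, you can read these files directly. A minimal sketch, assuming only what the table above documents (the paths and the deviceId key; the rest of each file's JSON structure is not specified here):

```python
import json
import os

def read_device_id(settings_path: str = os.path.expanduser("~/.letta/settings.json")) -> str:
    """Return the stable device UUID from Letta's settings file.

    Only the deviceId key is documented; any other keys in the file are
    implementation details and should not be relied on.
    """
    with open(settings_path) as f:
        return json.load(f)["deviceId"]
```

This is also the file to check when verifying that `~/.letta/` actually landed on persistent storage: if `deviceId` changes between restarts, the directory is being recreated.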