
Retrieve Conversation Stream

client.conversations.messages.stream(conversationID: string, body?: MessageStreamParams, options?: RequestOptions): MessageStreamResponse | Stream<LettaStreamingResponse>
POST /v1/conversations/{conversation_id}/stream

Resume the stream for the most recent active run in a conversation.

This endpoint allows you to reconnect to an active background stream for a conversation, enabling recovery from network interruptions.

Agent-direct mode: Pass conversation_id="default" with agent_id in request body to retrieve the stream for the agent's most recent active run.

Deprecated: Passing an agent ID as conversation_id still works but will be removed.
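The recovery pattern this endpoint enables can be sketched as follows. This is an illustrative sketch, not the SDK's implementation: it assumes the returned stream is async-iterable and that each chunk exposes a numeric `seq_id` field (the field name is an assumption), and it abstracts the SDK call behind an `openStream` factory so the resume cursor is explicit.

```typescript
// Hypothetical chunk shape: a numeric sequence id per chunk (assumed name).
type Chunk = { seq_id: number };

// Consume a stream, reconnecting after a drop and resuming from the last
// processed chunk via a `starting_after`-style cursor.
async function consumeWithRecovery(
  openStream: (startingAfter?: number) => Promise<AsyncIterable<Chunk>>,
  onChunk: (c: Chunk) => void,
  maxRetries = 3,
): Promise<void> {
  let cursor: number | undefined;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      for await (const chunk of await openStream(cursor)) {
        cursor = chunk.seq_id; // remember the last chunk we processed
        onChunk(chunk);
      }
      return; // stream finished normally
    } catch (err) {
      if (attempt === maxRetries) throw err; // give up after maxRetries
      // otherwise loop and reconnect, resuming after `cursor`
    }
  }
}
```

With the real SDK, `openStream` would wrap `client.conversations.messages.stream(...)`, passing the cursor as `starting_after` in the request body.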

Parameters
conversationID: string

The conversation identifier. Can be a conversation ID ('conv-'), 'default' for agent-direct mode (with agent_id parameter), or an agent ID ('agent-') for backwards compatibility (deprecated).

minLength: 1
maxLength: 42
body?: MessageStreamParams
agent_id?: string | null

Agent ID for agent-direct mode with 'default' conversation. Use with conversation_id='default' in the URL path.

batch_size?: number | null

Number of entries to read per batch.

include_pings?: boolean | null

Whether to include periodic keepalive ping messages in the stream to prevent connection timeouts.

poll_interval?: number | null

Seconds to wait between polls when no new data.

starting_after?: number

Sequence id to use as a cursor for pagination. The response will start streaming after this chunk sequence id.
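Taken together, the body parameters might be combined like this. The interface below is transcribed from the fields documented above; the helper function and the particular values are illustrative, not part of the SDK.

```typescript
// Illustrative shape of MessageStreamParams, mirroring the documented fields.
interface MessageStreamParams {
  agent_id?: string | null;       // agent-direct mode with conversation_id='default'
  batch_size?: number | null;     // entries read per batch
  include_pings?: boolean | null; // keepalive pings to prevent connection timeouts
  poll_interval?: number | null;  // seconds between polls when no new data
  starting_after?: number;        // resume streaming after this chunk sequence id
}

// Hypothetical helper: build resume parameters from the last sequence id seen.
function resumeParams(lastSeq: number): MessageStreamParams {
  return { include_pings: true, starting_after: lastSeq };
}
```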

Returns
MessageStreamResponse = unknown
Retrieve Conversation Stream
import Letta from '@letta-ai/letta-client';

const client = new Letta({
  apiKey: process.env['LETTA_API_KEY'], // This is the default and can be omitted
});

const response = await client.conversations.messages.stream('default', {
  // Agent-direct mode: 'default' in the path requires the agent's ID in the body.
  agent_id: 'agent_id',
});

console.log(response);
Returns Examples
{}