
Retrieve Conversation Stream

client.conversations.messages.stream(conversationID: string, body?: MessageStreamParams { batch_size, include_pings, poll_interval, starting_after }, options?: RequestOptions): MessageStreamResponse | Stream<LettaStreamingResponse>
POST /v1/conversations/{conversation_id}/stream

Resume the stream for the most recent active run in a conversation.

This endpoint allows you to reconnect to an active background stream for a conversation, enabling recovery from network interruptions.
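The recovery pattern this enables can be sketched without the SDK. In the snippet below, `readStream`, the `Chunk` shape, and the `seq_id` field are hypothetical stand-ins for the real stream and chunk types (a real client would call `client.conversations.messages.stream` with `starting_after` set to the last sequence ID it saw):

```typescript
// Hypothetical chunk shape; the real LettaStreamingResponse may differ.
interface Chunk {
  seq_id: number;
  data: string;
}

// Stand-in for the server stream. It skips chunks at or before the cursor,
// mimicking the documented `starting_after` behavior, and can simulate a drop.
async function* readStream(startingAfter: number, failAt?: number): AsyncGenerator<Chunk> {
  const all: Chunk[] = [
    { seq_id: 1, data: 'a' },
    { seq_id: 2, data: 'b' },
    { seq_id: 3, data: 'c' },
  ];
  for (const chunk of all) {
    if (chunk.seq_id <= startingAfter) continue; // server resumes after the cursor
    if (failAt !== undefined && chunk.seq_id === failAt) {
      throw new Error('network interruption');
    }
    yield chunk;
  }
}

// Reconnect loop: remember the last seq_id seen, and on failure reopen the
// stream with that value as the cursor so no chunk is missed or duplicated.
async function consumeWithResume(): Promise<string[]> {
  const received: string[] = [];
  let cursor = 0;
  let injectFailure = true; // simulate one mid-stream drop
  while (true) {
    try {
      for await (const chunk of readStream(cursor, injectFailure ? 3 : undefined)) {
        received.push(chunk.data);
        cursor = chunk.seq_id; // advance the resume cursor
      }
      return received; // stream drained cleanly
    } catch {
      injectFailure = false; // reconnect and resume from the cursor
    }
  }
}

consumeWithResume().then((r) => console.log(r.join(','))); // prints "a,b,c"
```

The key design point is that the cursor advances only after a chunk is actually consumed, so a reconnect after an interruption picks up exactly where the consumer left off.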

Parameters
conversationID: string

The ID of the conversation, in the format 'conv-' followed by a UUID.

minLength: 41
maxLength: 41
body: MessageStreamParams { batch_size, include_pings, poll_interval, starting_after }
batch_size?: number | null

Number of entries to read per batch.

include_pings?: boolean | null

Whether to include periodic keepalive ping messages in the stream to prevent connection timeouts.

poll_interval?: number | null

Seconds to wait between polls when no new data.

starting_after?: number

Sequence ID to use as a pagination cursor; the response will start streaming after this chunk sequence ID.
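Taken together, a fully populated request body might look like the following (all fields are optional and the values are purely illustrative):

```typescript
// Illustrative MessageStreamParams values; every field may be omitted.
const body = {
  batch_size: 100,     // read up to 100 entries per batch
  include_pings: true, // keepalive pings guard against connection timeouts
  poll_interval: 0.5,  // seconds to wait between polls when no new data
  starting_after: 42,  // resume streaming after chunk sequence id 42
};
```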

Returns
MessageStreamResponse = unknown
Retrieve Conversation Stream
import Letta from '@letta-ai/letta-client';

const client = new Letta({
  apiKey: process.env['LETTA_API_KEY'], // This is the default and can be omitted
});

const response = await client.conversations.messages.stream('conv-123e4567-e89b-42d3-8456-426614174000');

console.log(response);
Returns Examples
{}