Streams v2 requires SDK version 4.1.0 or later. Make sure to upgrade your @trigger.dev/sdk and @trigger.dev/react-hooks packages to use these features. If you’re on an earlier version, see the metadata.stream() documentation.

Overview
Streams v2 is a major upgrade that provides:
- Unlimited stream length (previously capped at 2000 chunks)
- Unlimited active streams per run (previously 5)
- Improved reliability with automatic resumption on connection loss
- 28-day stream retention (previously 1 day)
- Multiple client streams can pipe to a single stream
- Enhanced dashboard visibility for viewing stream data in real-time
Enabling Streams v2
Streams v2 is automatically enabled when triggering runs from SDK version 4.1.0 or later. If you aren’t triggering via the SDK, you’ll need to explicitly enable v2 streams by setting the x-trigger-realtime-streams-version=v2 header when triggering the task.
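For non-SDK triggering, a hedged sketch of sending the header over the HTTP API follows. The endpoint path and request body shape are assumptions for illustration — check the API reference for your version; only the header name comes from this page.

```typescript
// Illustrative only: trigger a task over HTTP with Streams v2 enabled.
async function triggerWithV2Streams(prompt: string): Promise<Response> {
  return fetch("https://api.trigger.dev/api/v1/tasks/my-task/trigger", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.TRIGGER_SECRET_KEY}`,
      "Content-Type": "application/json",
      // Opt in to Streams v2 when not triggering via the SDK
      "x-trigger-realtime-streams-version": "v2",
    },
    body: JSON.stringify({ payload: { prompt } }),
  });
}
```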
If you’d like to opt out of v2 streams, you can do so in one of the following two ways:
Option 1: Configure the SDK
Option 2: Environment Variable
Set the TRIGGER_V2_REALTIME_STREAMS=0 environment variable in your backend code (where you trigger tasks).
Limits Comparison
| Limit | Streams v1 | Streams v2 |
|---|---|---|
| Maximum stream length | 2000 | Unlimited |
| Number of active streams per run | 5 | Unlimited |
| Maximum streams per run | 10 | Unlimited |
| Maximum stream TTL | 1 day | 28 days |
| Maximum stream size | 10 MB | 300 MiB |
Quick Start
The recommended workflow for using Realtime Streams v2:
- Define your streams in a shared location using streams.define()
- Use the defined stream in your tasks with .pipe(), .append(), or .writer()
- Read from the stream using .read() or the useRealtimeStream hook in React
Defining Typed Streams (Recommended)
The recommended way to work with streams is to define them once with streams.define(). This allows you to specify the chunk type and stream ID in one place, and then reuse that definition throughout your codebase with full type safety.
Creating a Defined Stream
Define your streams in a shared location (like app/streams.ts or trigger/streams.ts):
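A minimal sketch of a shared definition file. The generic parameter and the `id` option follow the `streams.define()` shape described on this page; treat the exact option names as assumptions to verify against the SDK reference.

```typescript
// app/streams.ts
import { streams } from "@trigger.dev/sdk";

// A typed stream: every chunk piped, appended, or read is a string.
export const aiStream = streams.define<string>({
  id: "ai-output",
});

// Structured chunks work too — define the chunk type once, reuse it everywhere.
export const progressStream = streams.define<{ step: string; percent: number }>({
  id: "progress",
});
```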
Using Defined Streams in Tasks
Once defined, you can use all stream methods on your defined stream:

Reading from a Stream
Use the defined stream’s read() method to consume data from anywhere (frontend, backend, or another task):
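A hedged sketch of consuming a defined stream from backend code. That `read()` takes a run ID plus an options object is an assumption; the `timeoutInSeconds` option name comes from this page.

```typescript
import { aiStream } from "./streams";

// Print a run's "ai-output" stream to stdout as chunks arrive.
export async function printStream(runId: string): Promise<void> {
  // read() is assumed to return an async iterable of typed chunks.
  for await (const chunk of aiStream.read(runId, {
    timeoutInSeconds: 120, // bump for slow producers such as AI completions
  })) {
    process.stdout.write(chunk);
  }
}
```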
Appending to a Stream
Use the defined stream’s append() method to add a single chunk:
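A sketch of appending single chunks from inside a task, using the typed `progressStream` definition shown earlier as an assumption:

```typescript
import { task } from "@trigger.dev/sdk";
import { progressStream } from "./streams";

export const importTask = task({
  id: "import-task",
  run: async () => {
    // Append one typed chunk at a time as the work progresses.
    await progressStream.append({ step: "fetching-data", percent: 25 });
    await progressStream.append({ step: "processing", percent: 75 });
    await progressStream.append({ step: "done", percent: 100 });
  },
});
```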
Writing Multiple Chunks
Use the defined stream’s writer() method for more complex stream writing:
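A hedged sketch of the writer pattern; the exact writer API (write/close method names) is an assumption modeled on standard stream writers:

```typescript
import { task } from "@trigger.dev/sdk";
import { aiStream } from "./streams";

export const reportTask = task({
  id: "report-task",
  run: async () => {
    // writer() is assumed to give fine-grained control over when chunks
    // are written and when the stream closes.
    const writer = await aiStream.writer();
    try {
      await writer.write("Section 1\n");
      await writer.write("Section 2\n");
    } finally {
      await writer.close(); // always close so readers can complete
    }
  },
});
```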
Using Defined Streams in React
Defined streams work seamlessly with the useRealtimeStream hook:
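A sketch of a component subscribing to the defined stream. The hook's exact signature and return shape are assumptions — it is shown here accepting the defined stream, a run ID, and options, and returning accumulated parts plus an error:

```typescript
"use client";
import { useRealtimeStream } from "@trigger.dev/react-hooks";
import { aiStream } from "@/app/streams";

export function AiOutput({ runId, accessToken }: { runId: string; accessToken: string }) {
  // Passing the defined stream gives fully typed chunks (string here).
  const { parts, error } = useRealtimeStream(aiStream, runId, { accessToken });

  if (error) return <p>Stream error: {error.message}</p>;
  return <pre>{parts?.join("")}</pre>;
}
```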
Direct Stream Methods (Without Defining)
If you have a specific reason to avoid defined streams, you can use stream methods directly by specifying the stream key each time.

Direct Piping
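Without a definition, the direct forms take the stream key inline. The signature below (`streams.pipe(key, source)`) is an assumption based on the description above — confirm it against the SDK reference.

```typescript
import { streams, task } from "@trigger.dev/sdk";

export const directTask = task({
  id: "direct-task",
  run: async () => {
    // Any ReadableStream works as a source; this one emits two chunks.
    const source = new ReadableStream<string>({
      start(controller) {
        controller.enqueue("hello ");
        controller.enqueue("world");
        controller.close();
      },
    });

    // Stream key supplied inline instead of via a shared definition.
    await streams.pipe("my-stream", source);
  },
});
```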
Direct Reading
Direct Appending
Direct Writing
Default Stream
Every run has a “default” stream, allowing you to skip the stream key entirely. This is useful for simple cases where you only need one stream per run. Using direct methods:

Targeting Different Runs
You can pipe streams to parent, root, or any other run using the target option. This works with both defined streams and direct methods.
With Defined Streams
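A sketch of a child task piping onto its parent's stream via the documented `target` option; the surrounding task shape is illustrative.

```typescript
import { task } from "@trigger.dev/sdk";
import { aiStream } from "./streams";

export const childTask = task({
  id: "child-task",
  run: async (payload: { text: string }) => {
    const source = new ReadableStream<string>({
      start(controller) {
        controller.enqueue(payload.text);
        controller.close();
      },
    });

    // Pipe this child's output onto the parent run's stream so consumers
    // only subscribe to the parent. "root" or an explicit run ID also work.
    await aiStream.pipe(source, { target: "parent" });
  },
});
```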
With Direct Methods
Streaming from Outside a Task
If you specify a target run ID, you can pipe streams from anywhere (like a Next.js API route):
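For example, a hedged sketch of a Next.js route handler piping into an existing run's stream. Passing a run ID as `target` follows the sentence above; the route shape and request body are illustrative assumptions.

```typescript
// app/api/stream/route.ts
import { aiStream } from "@/app/streams";

export async function POST(request: Request) {
  // Hypothetical body: the run to target plus the text to stream.
  const { runId, text } = await request.json();

  const source = new ReadableStream<string>({
    start(controller) {
      controller.enqueue(text);
      controller.close();
    },
  });

  // Targeting a specific run ID lets code outside any task write to it.
  await aiStream.pipe(source, { target: runId });
  return Response.json({ ok: true });
}
```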
React Hook
Use the useRealtimeStream hook to subscribe to streams in your React components.
With Defined Streams (Recommended)
With Direct Stream Keys
If you prefer not to use defined streams, you can specify the stream key directly:

Using Default Stream
Hook Options
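The option names below (`timeoutInSeconds`, `throttleInMs`, `startIndex`) are the ones referenced elsewhere on this page; the hook signature and defaults are assumptions to verify.

```typescript
"use client";
import { useRealtimeStream } from "@trigger.dev/react-hooks";
import { aiStream } from "@/app/streams";

export function TunedOutput({ runId, accessToken }: { runId: string; accessToken: string }) {
  const { parts, error } = useRealtimeStream(aiStream, runId, {
    accessToken,            // public access token scoped to the run
    timeoutInSeconds: 300,  // wait longer for slow producers (e.g. AI output)
    throttleInMs: 100,      // batch updates to limit re-renders
    startIndex: 0,          // begin reading from a specific chunk index
  });

  if (error) return <p>{error.message}</p>;
  return <pre>{parts?.join("")}</pre>;
}
```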
AI SDK useChat transport with Trigger.dev tasks
If you want to use AI SDK UI’s useChat() on the frontend and run the backend as a Trigger.dev task,
use the @trigger.dev/ai transport.
Install
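Assuming a typical AI SDK UI setup, you would install the transport package alongside the AI SDK packages it is used with (exact peer requirements may differ):

```shell
npm install @trigger.dev/ai ai @ai-sdk/react
```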
Define a typed stream
Create a task that accepts rich chat transport payload
Use useChat() with Trigger chat transport
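A hedged sketch wiring AI SDK UI's useChat() to a Trigger.dev task through the transport. The `taskId` and `accessToken` constructor options and the token environment variable are assumptions; only the `TriggerChatTransport` name, `baseURL` default, and the options listed below come from this page.

```typescript
"use client";
import { useChat } from "@ai-sdk/react";
import { TriggerChatTransport } from "@trigger.dev/ai";

// Option names here are illustrative; see the transport's type definitions.
const transport = new TriggerChatTransport({
  taskId: "chat-task", // the Trigger.dev task that handles chat messages
  accessToken: process.env.NEXT_PUBLIC_TRIGGER_PUBLIC_TOKEN!, // hypothetical token source
  // baseURL omitted — defaults to https://api.trigger.dev
});

export function Chat() {
  const { messages, sendMessage } = useChat({ transport });

  return (
    <div>
      {messages.map((m) => (
        <p key={m.id}>
          {m.role}: {JSON.stringify(m.parts)}
        </p>
      ))}
      <button onClick={() => sendMessage({ text: "Hello" })}>Send</button>
    </div>
  );
}
```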
The task receives a payload containing chatId, trigger ("submit-message" or "regenerate-message"), messageId, messages, and request (headers, body, and metadata).
Advanced transport options
TriggerChatTransport also supports:
- payloadMapper (sync or async) for custom task payload shapes
- triggerOptions as an object or resolver function (sync or async)
- runStore for custom reconnect-state persistence (including async stores)
- onTriggeredRun callback (sync or async) to persist or observe run IDs
- onError callback to observe non-fatal transport issues
- headers passed through the transport can be an object, Headers, or tuple arrays
onError receives phase-aware details (payloadMapper, triggerOptions, triggerTask,
streamSubscribe, onTriggeredRun, consumeTrackingStream, reconnect) plus chatId,
optional runId, and the underlying error (non-Error throws are normalized to Error
instances).
Run-store cleanup is handled as best effort, and cleanup failures won’t mask the original
transport failure that triggered onError. Cleanup still attempts both persistence steps
(set inactive state and delete) even when one step fails.
reconnectToStream() only resumes active streams. When a stream completes or errors,
the transport clears stored run state and future reconnect attempts return null.
If stale inactive reconnect state cannot be cleaned up, reconnect still returns null and
the failure is surfaced through onError with phase reconnect.
Subsequent reconnect calls will retry stale inactive-state cleanup until it succeeds.
If onError is omitted, reconnect still returns null and continues without callback reporting.
baseURL defaults to https://api.trigger.dev when omitted.
It supports optional path prefixes and trailing slashes; both trigger and stream URLs
are normalized consistently, surrounding whitespace is trimmed before normalization, and
the resulting value must not be empty. The value must also be a valid absolute URL using
the http or https protocol, without query parameters, hash fragments, or embedded
username/password credentials.
Protocol matching is case-insensitive (HTTP://... and HTTPS://... are accepted).
Examples:
- ✅ https://api.trigger.dev
- ✅ https://api.trigger.dev/custom-prefix
- ✅ https://api.trigger.dev/custom-prefix/// (trimmed + normalized)
- ✅ \n\thttps://api.trigger.dev/custom-prefix/\t\n (newline/tab wrappers trimmed)
- ✅ https://api.trigger.dev/custom%20prefix (percent-encoded whitespace)
- ✅ https://api.trigger.dev/custom%3Fprefix%23segment (percent-encoded ?/#)
- ✅ Wrappers of trimmable Unicode whitespace are removed: \u00A0 (non-breaking space), \u1680 (ogham space mark), \u2003 (em space), \u2006 (six-per-em space), \u2007 (figure space), \u2008 (punctuation space), \u2009 (thin space), \u200A (hair space), \u205F (medium mathematical space), \u3000 (ideographic space), and \uFEFF (BOM)
- ❌ Wrappers of invisible non-whitespace characters are rejected: \u2060 (word joiner), \u200B (zero-width space), \u200C (zero-width non-joiner), \u200D (zero-width joiner), and \u180E (Mongolian vowel separator)
- ❌ https://api.trigger.dev?foo=bar (query parameters)
- ❌ https://api.trigger.dev#fragment (hash fragment)
- ❌ https://user:pass@api.trigger.dev (embedded credentials)
- ❌ ftp://api.trigger.dev (unsupported protocol)
- ❌ ws://api.trigger.dev / wss://api.trigger.dev (websocket protocols)
- ❌ Values that are empty after trimming wrapper whitespace, e.g. \u1680///\u1680, \u2007///\u2007, or \u3000///\u3000 (\u180E///\u180E is instead rejected as internal invisible-separator whitespace)
- ❌ Queries, fragments, credentials, and websocket protocols are still rejected after wrapper trimming, e.g. \n\thttps://api.trigger.dev/base/?query=1\t\n
- ❌ Internal whitespace or invisible characters anywhere inside the URL, e.g. https://api.trigger.dev/in valid, https://api.trigger.dev/\ninternal, https://api.trigger.dev/\u200Binternal, https://api.trigger.dev/\u3000internal
Validation failures produce one of these error messages:
- baseURL must not be empty
- baseURL must be a valid absolute URL
- baseURL must not contain internal whitespace characters
- baseURL must use http or https protocol
- baseURL must not include query parameters or hash fragments
- baseURL must not include username or password credentials

Invisible characters treated as internal whitespace include \u200B, \u200C, \u200D, \u2060, and \uFEFF.
When multiple issues are present, validation order is deterministic:
internal whitespace → protocol → query/hash → credentials.
Examples of ordering:
- ftp://example.com?x=1 → baseURL must use http or https protocol
- https://user:pass@example.com?x=1 → baseURL must not include query parameters or hash fragments
- ftp://user:pass@example.com/in valid?x=1 → baseURL must not contain internal whitespace characters
- ftp://user:pass@example.com/\u2060invalid?x=1#fragment → baseURL must not contain internal whitespace characters
- ftp://user:pass@example.com/\u180Einvalid?x=1#fragment → baseURL must not contain internal whitespace characters
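To make the precedence concrete, here is a small TypeScript sketch mirroring the documented rules. It is illustrative, not the library's implementation; it relies on JavaScript's trim() happening to treat the same characters as trimmable wrappers (zero-width characters and \u180E are not whitespace, so they survive trimming and get rejected).

```typescript
// Returns the first documented error message, or null when valid.
function validateBaseURL(input: string): string | null {
  // trim() removes standard Unicode whitespace wrappers (incl. NBSP, BOM).
  const trimmed = input.trim();
  if (trimmed === "") return "baseURL must not be empty";

  // 1. Internal whitespace, plus invisible separators trim() leaves behind.
  if (/[\s\u180E\u200B-\u200D\u2060]/.test(trimmed)) {
    return "baseURL must not contain internal whitespace characters";
  }

  let url: URL;
  try {
    url = new URL(trimmed);
  } catch {
    return "baseURL must be a valid absolute URL";
  }

  // 2. Protocol (URL lowercases it, so matching is case-insensitive).
  if (url.protocol !== "http:" && url.protocol !== "https:") {
    return "baseURL must use http or https protocol";
  }

  // 3. Query parameters or hash fragments.
  if (url.search !== "" || url.hash !== "") {
    return "baseURL must not include query parameters or hash fragments";
  }

  // 4. Embedded credentials.
  if (url.username !== "" || url.password !== "") {
    return "baseURL must not include username or password credentials";
  }

  return null;
}
```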
@trigger.dev/ai also exports:
- TriggerChatHeadersInput
- TriggerChatSendMessagesOptions
- TriggerChatReconnectOptions
Complete Example: AI Streaming
Define the stream
Create the task
Frontend component
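The three pieces above might fit together along these lines. The `streamText` and `openai` calls come from the AI SDK; the Trigger.dev call shapes follow the descriptions on this page and should be checked against the API reference.

```typescript
// trigger/streams.ts
import { streams } from "@trigger.dev/sdk";
export const aiStream = streams.define<string>({ id: "ai-output" });

// trigger/ai-task.ts
import { task } from "@trigger.dev/sdk";
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";

export const aiTask = task({
  id: "ai-task",
  run: async (payload: { prompt: string }) => {
    const result = streamText({
      model: openai("gpt-4o-mini"), // illustrative model choice
      prompt: payload.prompt,
    });
    // Pipe the model's text stream; pipe() consumes the whole stream.
    await aiStream.pipe(result.textStream);
  },
});

// app/ai-output.tsx — put "use client" at the top of the real file
import { useRealtimeStream } from "@trigger.dev/react-hooks";

export function AiOutput({ runId, accessToken }: { runId: string; accessToken: string }) {
  const { parts, error } = useRealtimeStream(aiStream, runId, { accessToken });
  if (error) return <p>{error.message}</p>;
  return <pre>{parts?.join("")}</pre>;
}
```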
Migration from v1
If you’re using the old metadata.stream() API, here’s how to migrate to the recommended v2 approach:
Step 1: Define Your Streams
Create a shared streams definition file:

Step 2: Update Your Tasks
Replace metadata.stream() with the defined stream’s pipe() method:
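A before/after sketch of that replacement, using a `ReadableStream` as a stand-in source (the defined `aiStream` import is an assumption from the shared file in Step 1):

```typescript
import { task } from "@trigger.dev/sdk";
import { aiStream } from "./streams";

export const migratedTask = task({
  id: "migrated-task",
  run: async () => {
    const source = new ReadableStream<string>({
      start(controller) {
        controller.enqueue("hello");
        controller.close();
      },
    });

    // Before (v1): await metadata.stream("ai-output", source);
    // After (v2): the defined stream carries the key and chunk type.
    await aiStream.pipe(source);
  },
});
```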
Step 3: Update Your Frontend
Use the defined stream with useRealtimeStream:
Alternative: Direct Methods (Not Recommended)
If you prefer not to use defined streams, you can use direct methods:

Reliability Features
Streams v2 includes automatic reliability improvements:
- Automatic resumption: If a connection is lost, both appending and reading will automatically resume from the last successful chunk
- No data loss: Network issues won’t cause stream data to be lost
- Idempotent operations: Duplicate chunks are automatically handled
Dashboard Integration
Streams are now visible in the Trigger.dev dashboard, allowing you to:
- View stream data in real-time as it’s generated
- Inspect historical stream data for completed runs
- Debug streaming issues with full visibility into chunk delivery
Best Practices
- Always use streams.define(): Define your streams in a shared location for better organization, type safety, and code reusability. This is the recommended approach for all streams.
- Export stream types: Use InferStreamType to export types for your frontend components
- Handle errors gracefully: Always check for errors when reading streams in your UI
- Set appropriate timeouts: Adjust timeoutInSeconds based on your use case (AI completions may need longer timeouts)
- Target parent runs: When orchestrating with child tasks, pipe to parent runs for easier consumption
- Throttle frontend updates: Use throttleInMs in useRealtimeStream to prevent excessive re-renders
- Use descriptive stream IDs: Choose clear, descriptive IDs like "ai-output" or "progress" instead of generic names
Troubleshooting
Stream not appearing in dashboard
- Ensure you’ve enabled Streams v2 via the future flag or environment variable
- Verify your task is actually writing to the stream
- Check that the stream key matches between writing and reading
Stream timeout errors
- Increase timeoutInSeconds in your read() or useRealtimeStream() calls
- Ensure your stream source is actively producing data
- Check network connectivity between your application and Trigger.dev
Missing chunks
- With v2, chunks should never be lost due to automatic resumption
- Verify you’re reading from the correct stream key
- Check the startIndex option if you’re not seeing the expected chunks

