# Data Stream Protocol
Integration with data stream protocol endpoints for streaming AI responses.
The `@assistant-ui/react-data-stream` package provides integration with data stream protocol endpoints, enabling streaming AI responses with tool support and state management.
## Overview
The data stream protocol is a standardized format for streaming AI responses that supports:
- Streaming text responses with real-time updates
- Tool calling with structured parameters and results
- State management for conversation context
- Error handling and cancellation support
- Attachment support for multimodal interactions
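Concretely, "streaming text responses with real-time updates" means the client reads the HTTP body incrementally instead of waiting for the complete reply. A minimal, library-free sketch of that consumption loop, using only the Web-standard `ReadableStream` API (the chunk contents here are made up for illustration):

```typescript
// Simulate a streamed response body: three text chunks arriving over time.
const encoder = new TextEncoder();
const body = new ReadableStream<Uint8Array>({
  start(controller) {
    for (const part of ["Str", "eam", "ing"]) {
      controller.enqueue(encoder.encode(part));
    }
    controller.close();
  },
});

// Consume the stream incrementally, as a chat UI would.
const decoder = new TextDecoder();
const reader = body.getReader();
let text = "";
for (;;) {
  const { done, value } = await reader.read();
  if (done) break;
  text += decoder.decode(value, { stream: true });
  // In a real UI, re-render with the partial `text` after each chunk.
}
console.log(text); // "Streaming"
```

The runtime performs this loop for you; the sketch only shows why partial text can appear in the UI before the response finishes.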
## Installation
```sh
npm install @assistant-ui/react-data-stream
```

Or with your preferred package manager:

```sh
yarn add @assistant-ui/react-data-stream
pnpm add @assistant-ui/react-data-stream
bun add @assistant-ui/react-data-stream
```

## Basic Usage
### Set up the Runtime
Use `useDataStreamRuntime` to connect to your data stream endpoint:
"use client";
import { useDataStreamRuntime } from "@assistant-ui/react-data-stream";
import { AssistantRuntimeProvider } from "@assistant-ui/react";
import { Thread } from "@/components/assistant-ui/thread";
export default function ChatPage() {
const runtime = useDataStreamRuntime({
api: "/api/chat",
});
return (
<AssistantRuntimeProvider runtime={runtime}>
<Thread />
</AssistantRuntimeProvider>
);
}Create Backend Endpoint
Your backend endpoint should accept POST requests and return data stream responses:
```ts
import { createAssistantStreamResponse } from "assistant-stream";

export async function POST(request: Request) {
  const { messages, tools, system } = await request.json();

  return createAssistantStreamResponse(async (controller) => {
    // Process the request with your AI provider
    const stream = await processWithAI({
      messages,
      tools,
      system,
    });

    // Stream the response
    for await (const chunk of stream) {
      controller.appendText(chunk.text);
    }
  });
}
```
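Under the hood, a streaming endpoint is just an HTTP response whose body is produced incrementally. As a rough, framework-agnostic sketch of the producer side, using only Web-standard `ReadableStream` and `Response` (the `chunks` array is a stand-in for tokens from your AI provider, not part of the package's API):

```typescript
// `chunks` stands in for tokens arriving from an AI provider.
const chunks = ["Hello", ", ", "world", "!"];

function streamingResponse(parts: string[]): Response {
  const encoder = new TextEncoder();
  const body = new ReadableStream<Uint8Array>({
    start(controller) {
      for (const part of parts) {
        // Each enqueued chunk is flushed to the client as it is produced.
        controller.enqueue(encoder.encode(part));
      }
      controller.close();
    },
  });
  return new Response(body, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}

// Reading the response back concatenates the streamed chunks.
const text = await streamingResponse(chunks).text();
console.log(text); // "Hello, world!"
```

`createAssistantStreamResponse` layers the data stream protocol's framing (text parts, tool calls, state) on top of this same mechanism.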
## Advanced Configuration

### Custom Headers and Authentication
```tsx
const runtime = useDataStreamRuntime({
  api: "/api/chat",
  headers: {
    Authorization: "Bearer " + token,
    "X-Custom-Header": "value",
  },
  credentials: "include",
});
```

### Dynamic Headers
```tsx
const runtime = useDataStreamRuntime({
  api: "/api/chat",
  headers: async () => {
    const token = await getAuthToken();
    return {
      Authorization: "Bearer " + token,
    };
  },
});
```

### Event Callbacks
```tsx
const runtime = useDataStreamRuntime({
  api: "/api/chat",
  onResponse: (response) => {
    console.log("Response received:", response.status);
  },
  onFinish: (message) => {
    console.log("Message completed:", message);
  },
  onError: (error) => {
    console.error("Error occurred:", error);
  },
  onCancel: () => {
    console.log("Request cancelled");
  },
});
```

## Tool Integration
> **Note:** Human-in-the-loop tools (using `human()` for tool interrupts) are not supported in the data stream runtime. If you need human approval workflows or interactive tool UIs, consider using LocalRuntime or Assistant Cloud instead.
### Frontend Tools
Use the `frontendTools` helper to serialize client-side tools:
```tsx
import { frontendTools } from "@assistant-ui/react-data-stream";
import { makeAssistantTool } from "@assistant-ui/react";
import { z } from "zod";

const weatherTool = makeAssistantTool({
  toolName: "get_weather",
  description: "Get current weather",
  parameters: z.object({
    location: z.string(),
  }),
  execute: async ({ location }) => {
    const weather = await fetchWeather(location);
    return `Weather in ${location}: ${weather}`;
  },
});

const runtime = useDataStreamRuntime({
  api: "/api/chat",
  body: {
    tools: frontendTools({
      get_weather: weatherTool,
    }),
  },
});
```

### Backend Tool Processing
Your backend should handle tool calls and return results:
```ts
// Tools are automatically forwarded to your endpoint
const { tools } = await request.json();

// Process tools with your AI provider
const response = await ai.generateText({
  messages,
  tools,
  // Tool results are streamed back automatically
});
```
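To make the tool round-trip concrete, here is a small, self-contained sketch of dispatching a model-issued tool call against a registry of executable tools. The type shapes and names here are illustrative, not the package's actual types:

```typescript
// Illustrative shapes; the real protocol types live in the package.
type ToolCall = { toolName: string; args: Record<string, unknown> };
type Tool = (args: Record<string, unknown>) => Promise<string>;

const tools: Record<string, Tool> = {
  get_weather: async (args) => `Weather in ${args.location}: sunny`,
};

// Execute a tool call and produce a result to send back to the model.
async function runToolCall(call: ToolCall): Promise<string> {
  const tool = tools[call.toolName];
  if (!tool) throw new Error(`Unknown tool: ${call.toolName}`);
  return tool(call.args);
}

const result = await runToolCall({
  toolName: "get_weather",
  args: { location: "Berlin" },
});
console.log(result); // "Weather in Berlin: sunny"
```

In practice your AI provider drives this loop for you; the sketch only shows the lookup-execute-return shape that tool results follow before being streamed back.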
## Assistant Cloud Integration

For Assistant Cloud deployments, use `useCloudRuntime`:
```tsx
import { useCloudRuntime } from "@assistant-ui/react-data-stream";

const runtime = useCloudRuntime({
  cloud: assistantCloud,
  assistantId: "my-assistant-id",
});
```

> **Note:** The `useCloudRuntime` hook is currently under active development and not yet ready for production use.
## Message Conversion
The package includes utilities for converting between message formats:
```tsx
import { toLanguageModelMessages } from "@assistant-ui/react-data-stream";

// Convert assistant-ui messages to language model format
const languageModelMessages = toLanguageModelMessages(messages, {
  unstable_includeId: true, // Include message IDs
});
```

## Error Handling
The runtime automatically handles common error scenarios:
- Network errors: Automatically retried with exponential backoff
- Stream interruptions: Gracefully handled with partial content preservation
- Tool execution errors: Displayed in the UI with error states
- Cancellation: Clean abort signal handling
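The retry behavior described above can be pictured as a standard exponential-backoff loop. A simplified, self-contained sketch (the runtime's actual retry policy, attempt limits, and delays may differ; the delays here are kept tiny so the example runs fast):

```typescript
// Retry an async operation, doubling the delay after each failure.
async function withRetry<T>(
  op: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 1,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await op();
    } catch (err) {
      lastError = err;
      // Backoff: 1ms, 2ms, 4ms, ... between attempts.
      const delay = baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}

// Simulated flaky request: fails twice, then succeeds.
let attempts = 0;
const result = await withRetry(async () => {
  attempts++;
  if (attempts < 3) throw new Error("network error");
  return "ok";
});
console.log(attempts, result); // 3 "ok"
```

Because the runtime handles this internally, application code normally only needs `onError` for surfacing failures that exhaust the retries.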
## Best Practices
### Performance Optimization
```tsx
// Use React.memo for expensive components
const OptimizedThread = React.memo(Thread);

// Memoize runtime configuration
const runtimeConfig = useMemo(
  () => ({
    api: "/api/chat",
    headers: { Authorization: `Bearer ${token}` },
  }),
  [token],
);

const runtime = useDataStreamRuntime(runtimeConfig);
```

### Error Boundaries
```tsx
import { ErrorBoundary } from "react-error-boundary";

function ChatErrorFallback({ error, resetErrorBoundary }) {
  return (
    <div role="alert">
      <h2>Something went wrong:</h2>
      <pre>{error.message}</pre>
      <button onClick={resetErrorBoundary}>Try again</button>
    </div>
  );
}

export default function App() {
  return (
    <ErrorBoundary FallbackComponent={ChatErrorFallback}>
      <AssistantRuntimeProvider runtime={runtime}>
        <Thread />
      </AssistantRuntimeProvider>
    </ErrorBoundary>
  );
}
```

### State Persistence
```tsx
const runtime = useDataStreamRuntime({
  api: "/api/chat",
  body: {
    // Include conversation state
    state: conversationState,
  },
  onFinish: (message) => {
    // Save state after each message
    saveConversationState(message.metadata.unstable_state);
  },
});
```

## Examples
- Basic Data Stream Example - Simple streaming chat
- Tool Integration Example - Frontend and backend tools
- Authentication Example - Secure endpoints
## API Reference
For detailed API documentation, see the `@assistant-ui/react-data-stream` API Reference.