Wire Cloudflare's stateful agent framework into a React chat UI with assistant-ui via the standard AI SDK runtime. WebSocket transport, server-side persistence, tool calling, all preserved.
Cloudflare Agents is Cloudflare's framework for stateful AI agents that run on Durable Objects at the edge. Each agent owns its own SQLite-backed message history, exposes a WebSocket channel for low-latency streaming, and can call tools (server-side or client-side).
This is an integration guide, not a runtime adapter. assistant-ui does not ship a @assistant-ui/react-cloudflare-agents package. @cloudflare/ai-chat's useAgentChat returns a structural extension of the AI SDK's useChat, so the existing AI SDK runtime consumes it directly.
Architecture
Cloudflare Agents handles the server half: a Durable Object subclasses AIChatAgent from @cloudflare/ai-chat, owns the message history, and streams responses back over a WebSocket. @cloudflare/ai-chat/react's useAgentChat hook wraps that WebSocket and exposes the same messages, sendMessage, regenerate, status, stop, setMessages, addToolOutput surface that the AI SDK's useChat does, plus a few Cloudflare-specific extras (clearHistory, isServerStreaming, isToolContinuation).
assistant-ui handles the client half. useAISDKRuntime from @assistant-ui/react-ai-sdk reads exactly those AI SDK methods off whatever you pass in, so feeding it useAgentChat's return value yields a fully-featured runtime: streaming, tool calling, edit, reload, history import and export, attachments, suggestions.
Shared adapters (attachments, speech, feedback, history) work the same way as described in adapters. Multi-thread support needs a custom thread list wired around useAISDKRuntime; AssistantCloud integrates via useChatRuntime (which constructs its own useChat internally) and is not compatible with the useAgentChat wiring shown here.
Requirements
- A Cloudflare account with Workers enabled and wrangler installed.
- A frontend project (Next.js or any other AI-SDK-compatible React app).
- Model API keys (OPENAI_API_KEY, ANTHROPIC_API_KEY, etc.) configured as Worker secrets.
Setup
The setup has two halves. Steps 1 to 4 happen in the Worker project (Cloudflare side); steps 5 to 7 happen in a separate assistant-ui frontend. You'll have two dev processes running by the end: wrangler dev for the Worker, and your frontend's dev server.
Scaffold the Worker project
npm create cloudflare@latest my-agent -- --type=hello-world --ts
cd my-agent

Add the Cloudflare Agents packages and the AI SDK:

npm install [email protected] @cloudflare/[email protected] ai@latest @ai-sdk/openai@latest

The two Cloudflare packages above are pinned to exact versions because agents and @cloudflare/ai-chat are pre-1.0 and ship breaking changes between minor releases. See version stability below before bumping them.
Define the agent
AIChatAgent already implements message persistence, streaming protocol, and WebSocket plumbing. Override onChatMessage to plug in your model and tools.
import { AIChatAgent } from "@cloudflare/ai-chat";
import { openai } from "@ai-sdk/openai";
import { streamText, convertToModelMessages } from "ai";
export type Env = {
OPENAI_API_KEY: string;
Chat: DurableObjectNamespace<Chat>;
};
export class Chat extends AIChatAgent<Env> {
async onChatMessage(onFinish: Parameters<typeof streamText>[0]["onFinish"]) {
return streamText({
model: openai("gpt-4o-mini"),
messages: await convertToModelMessages(this.messages),
onFinish,
});
}
}

Env is exported alongside Chat so the Worker entry point can reuse the same type. The Chat: DurableObjectNamespace<Chat> field mirrors the binding declared in wrangler.jsonc (next step) and is what routeAgentRequest looks up to resolve the agent. DurableObjectNamespace is a global from @cloudflare/workers-types, which the npm create cloudflare scaffold sets up by default.
this.messages is the persisted history for this Durable Object instance. Each unique agent name you connect with from the client (step 7) gets its own instance and its own message log.
Register the Durable Object and route requests
import { routeAgentRequest } from "agents";
import { Chat, type Env } from "./chat";
export { Chat };
const cors = (request: Request) => ({
"Access-Control-Allow-Origin": request.headers.get("Origin") ?? "*",
"Access-Control-Allow-Headers": "Content-Type, Upgrade",
"Access-Control-Allow-Methods": "GET, POST, OPTIONS",
});
export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    if (request.method === "OPTIONS") {
      return new Response(null, { headers: cors(request) });
    }
    const upstream =
      (await routeAgentRequest(request, env)) ??
      new Response("Not found", { status: 404 });
    // WebSocket upgrades (101) must pass through untouched: a 101 response
    // cannot be reconstructed, and browsers do not apply CORS to it anyway.
    if (upstream.status === 101) return upstream;
    const res = new Response(upstream.body, upstream);
    for (const [k, v] of Object.entries(cors(request))) res.headers.set(k, v);
    return res;
  },
} satisfies ExportedHandler<Env>;

routeAgentRequest handles WebSocket upgrades, agent lookup by URL path, and the /get-messages HTTP endpoint that the frontend uses for history rehydration. The cors helper reflects the request origin so the frontend can talk to the Worker across ports during local development. WebSocket upgrades bypass CORS in the browser, but the /get-messages HTTP fetch and any custom routes need these headers. For production, replace the wildcard fallback with an explicit allowlist.
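A minimal sketch of that allowlist variant, replacing the wildcard cors helper; the origins listed are placeholders for your real frontend origins:

```typescript
// Reflect the Origin header only when it is on an explicit allowlist.
const ALLOWED_ORIGINS = new Set([
  "https://app.example.com", // placeholder: your deployed frontend
  "http://localhost:3000",   // local frontend dev
]);

export const corsFor = (origin: string | null): Record<string, string> => {
  // "null" is a deliberate non-match: browsers will refuse to share
  // the response with any page, instead of falling back to a wildcard.
  const allowOrigin =
    origin !== null && ALLOWED_ORIGINS.has(origin) ? origin : "null";
  return {
    "Access-Control-Allow-Origin": allowOrigin,
    "Access-Control-Allow-Headers": "Content-Type, Upgrade",
    "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
  };
};
```

Swap corsFor(request.headers.get("Origin")) in wherever the example above calls cors(request).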
Wire the Durable Object binding in wrangler.jsonc:
{
"name": "my-agent",
"main": "src/index.ts",
"compatibility_date": "2026-01-01",
"compatibility_flags": ["nodejs_compat"],
"durable_objects": {
"bindings": [{ "name": "Chat", "class_name": "Chat" }]
},
"migrations": [
{ "tag": "v1", "new_sqlite_classes": ["Chat"] }
]
}

The binding name and class_name must match the exported class. new_sqlite_classes is required so the Durable Object can use SQLite for message storage.
Run the Worker locally
Local wrangler dev reads environment variables from a .dev.vars file in the project root (not from the remote secret store):
OPENAI_API_KEY=sk-...

wrangler dev

For production, upload the same key as a deployed Worker secret before wrangler deploy:

wrangler secret put OPENAI_API_KEY

The Worker boots on http://localhost:8787. Leave it running.
Initialize the assistant-ui frontend
In a different directory:
npx assistant-ui@latest create

Or, to add assistant-ui to an existing app:

npx assistant-ui@latest init

Either command gives you a default chat page and a local API route at app/api/chat/route.ts. You won't use the local route, since the agent runs on the Worker; delete it once the next step is wired.
Install the Cloudflare client packages
In the frontend project:
npm install [email protected] @cloudflare/[email protected]

Wire the runtime
"use client";
import { useAgent } from "agents/react";
import { useAgentChat } from "@cloudflare/ai-chat/react";
import { AssistantRuntimeProvider } from "@assistant-ui/react";
import { useAISDKRuntime } from "@assistant-ui/react-ai-sdk";
import { Thread } from "@/components/assistant-ui/thread";
export const Assistant = () => {
const agent = useAgent({
agent: "Chat",
name: "default",
host: process.env.NEXT_PUBLIC_AGENT_HOST!,
});
const chat = useAgentChat({ agent });
const runtime = useAISDKRuntime(chat);
return (
<AssistantRuntimeProvider runtime={runtime}>
<Thread />
</AssistantRuntimeProvider>
);
};

Set the Worker URL in your frontend environment:

NEXT_PUBLIC_AGENT_HOST=http://localhost:8787

NEXT_PUBLIC_* exposes the value to the browser. In production, point this at your deployed Worker (e.g. https://my-agent.example.workers.dev).
name: "default" is the Durable Object instance key. Pass a per-user value (a user ID, session ID, or chat ID) to give each user their own persisted history. Switching name from the client opens a new WebSocket connection to a different Durable Object instance.
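One wrinkle when deriving name from user data: the agent name ends up in the WebSocket URL path, so it's worth URL-encoding the components. A hypothetical helper (agentNameFor and the user/chat ID arguments are illustrative names):

```typescript
// Derive a stable Durable Object instance name per (user, chat) pair,
// URL-encoding each component so arbitrary IDs stay path-safe.
export const agentNameFor = (userId: string, chatId: string): string =>
  `${encodeURIComponent(userId)}.${encodeURIComponent(chatId)}`;
```

Then pass it to useAgent, e.g. name: agentNameFor(user.id, chat.id), instead of the hardcoded "default".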
Notes
Type compatibility with useChat
useAgentChat's return type is Omit<ReturnType<typeof useChat>, "addToolOutput"> & { ... }. The addToolOutput option shape differs slightly between the two: useChat accepts { state, tool, toolCallId, ... }; useAgentChat accepts { state, toolCallId, toolName?, ... }. At runtime the call paths converge through useAISDKRuntime without issue (verified against @cloudflare/[email protected]). If the TypeScript compiler flags the call, cast at the call site: useAISDKRuntime(chat as Parameters<typeof useAISDKRuntime>[0]), or chat as unknown as Parameters<typeof useAISDKRuntime>[0] if TypeScript still refuses the direct cast. (satisfies does not help here; it validates assignability without changing the inferred type, so it surfaces the same error.)
Cloudflare-specific extras
useAgentChat exposes three values that useChat does not:
- clearHistory() sends a cf_agent_chat_clear frame and wipes the Durable Object's SQLite store. Bind it to a "Clear chat" button if you need a server-side history reset; setMessages([]) alone only clears the client view.
- isServerStreaming is true while the server is pushing tokens, independent of client-initiated request state. Use it for a universal streaming indicator.
- isToolContinuation distinguishes "server auto-continuing after a tool result" from "user just sent a new message". Useful for typing-indicator gating.
Destructure these alongside chat and pass them into your UI directly; they don't need to flow through the runtime.
setMessages round-trips through the Durable Object
useAgentChat overrides setMessages to broadcast the new list over the WebSocket so the Durable Object's SQLite history stays in sync. This means assistant-ui's onImport, onEdit, onReload, and pending-tool cancellation paths all persist server-side automatically. The tradeoff is one extra WebSocket round-trip per mutation, which can race if the connection is lagging; assume eventual consistency, not transactional.
Authenticate the Worker before going to production
routeAgentRequest accepts any client that knows the agent class and name. If you derive name from a user ID (as recommended for per-user history), any client that knows or guesses another user's ID can connect to that Durable Object and read its full message log. Before deploying:
- Gate the fetch handler with a header or cookie check (e.g. a JWT issued by your auth backend), and only call routeAgentRequest after the request is authenticated.
- Pass the same credential from the frontend via useAgent's headers or query options so the WebSocket upgrade carries it.
- Tighten the CORS Access-Control-Allow-Origin to an explicit allowlist; the wildcard in the example above is for local development only.
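A sketch of the first step, assuming a bearer token in the Authorization header. verifyToken is a placeholder for real JWT verification (e.g. jose's jwtVerify); here it only checks a prefix so the gating logic stays visible:

```typescript
// Placeholder verification: substitute a real JWT check in production.
export const verifyToken = (token: string | null): boolean =>
  token !== null && token.startsWith("valid-");

// Extract a bearer token from the request and verify it. Call this at the
// top of the Worker's fetch handler and return a 401 before ever reaching
// routeAgentRequest when it fails.
export const isAuthorized = (request: Request): boolean => {
  const header = request.headers.get("Authorization");
  const token = header?.startsWith("Bearer ") ? header.slice(7) : null;
  return verifyToken(token);
};
```

On the client, send the same token through useAgent's headers option so the WebSocket upgrade request carries it too.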
Version stability
agents and @cloudflare/ai-chat are pre-1.0 and ship breaking changes between minor versions. Pin both to exact versions in package.json and read the Cloudflare changelog before bumping. The useAgentChat return shape has been additive since 0.3.0, so the integration above should keep working across patch releases.