Adapter for LangChain's `useStream` hook, exposed as an assistant-ui runtime.
`@assistant-ui/react-langchain` wraps `useStream` from `@langchain/react` and exposes it as an assistant-ui runtime. Use this package if you are already integrating your app with `@langchain/react` and want assistant-ui on top of the upstream hook.
assistant-ui ships two adapters for LangGraph backends:
- `@assistant-ui/react-langgraph` integrates with `@langchain/langgraph-sdk` directly and exposes features like subgraph events, UI messages, message metadata, and end-to-end cancellation.
- `@assistant-ui/react-langchain` (this page) wraps `@langchain/react`'s `useStream`. It is lighter-weight and stays aligned with upstream, but currently does not expose every `react-langgraph` feature.
See the comparison doc for a feature gap table.
Requirements
You need a LangGraph Cloud API server. You can start a server locally via LangGraph Studio or use LangSmith for a hosted version.
The state of the graph you are using must have a `messages` key containing a list of LangChain-style messages (or pass a custom `messagesKey`).
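As a rough sketch of what that requirement means, the graph state might look like the following. The field names on the message objects are simplified illustrations of the LangChain message format, and the extra `todos` key is a hypothetical custom state key, not something the runtime requires:

```typescript
// Illustrative shape of the graph state the runtime expects.
type LangChainStyleMessage = {
  type: "human" | "ai" | "tool" | "system";
  content: string;
  id?: string;
};

type GraphState = {
  // Required (or renamed via the messagesKey option).
  messages: LangChainStyleMessage[];
  // Hypothetical example of additional structured state.
  todos?: { id: string; title: string; done: boolean }[];
};

const state: GraphState = {
  messages: [
    { type: "human", content: "Hi" },
    { type: "ai", content: "Hello! How can I help?" },
  ],
};
```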
Installation
Install dependencies
```sh
npm install @assistant-ui/react @assistant-ui/react-langchain @langchain/react @langchain/langgraph-sdk
```
Define a MyAssistant component
```tsx
"use client";

import { Thread } from "@/components/assistant-ui/thread";
import { AssistantRuntimeProvider } from "@assistant-ui/react";
import { useStreamRuntime } from "@assistant-ui/react-langchain";

export function MyAssistant() {
  const runtime = useStreamRuntime({
    assistantId: process.env["NEXT_PUBLIC_LANGGRAPH_ASSISTANT_ID"]!,
    apiUrl: process.env["NEXT_PUBLIC_LANGGRAPH_API_URL"],
  });

  return (
    <AssistantRuntimeProvider runtime={runtime}>
      <Thread />
    </AssistantRuntimeProvider>
  );
}
```
Use the MyAssistant component
```tsx
import { MyAssistant } from "@/components/MyAssistant";

export default function Home() {
  return (
    <main className="h-dvh">
      <MyAssistant />
    </main>
  );
}
```
Set environment variables
Create a `.env.local` file in your project with the following variables:
```sh
NEXT_PUBLIC_LANGGRAPH_API_URL=http://localhost:2024
NEXT_PUBLIC_LANGGRAPH_ASSISTANT_ID=your_graph_id
```
Set up UI components
Follow the UI Components guide to set up the UI components.
useStreamRuntime options
`useStreamRuntime` accepts every option that `useStream` from `@langchain/react` does, plus three assistant-ui-specific fields:
| Option | Type | Description |
|---|---|---|
| `cloud` | `AssistantCloud` | Optional. Persists threads via assistant-cloud. |
| `adapters` | `{ attachments?, speech?, feedback? }` | Optional. Attachment, speech, and feedback adapters. |
| `messagesKey` | `string` | The state key that holds messages. Defaults to `"messages"`. |
Reading custom state keys
LangGraph agents often expose structured state beyond messages (plans, todos, scratch files, generative-UI artifacts). Read them directly with `useLangChainState`. It mirrors `useStream().values[key]` upstream and updates when the stream emits new state.
```tsx
import { useLangChainState } from "@assistant-ui/react-langchain";

type Todo = { id: string; title: string; done: boolean };

function TodoList() {
  const todos = useLangChainState<Todo[]>("todos", []);
  return (
    <ul>
      {todos.map((t) => (
        <li key={t.id}>
          {t.done ? "✓" : "○"} {t.title}
        </li>
      ))}
    </ul>
  );
}
```
Signatures:
```ts
useLangChainState<T>(key: string): T | undefined;
useLangChainState<T>(key: string, defaultValue: T): T;
```
This hook is especially useful with the deepagents middleware, whose `write_todos` step updates `state.todos` alongside the tool-call stream. Reading the state key directly avoids reconstructing the list from partial tool-call args.
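The lookup semantics behind those two overloads can be mirrored in plain TypeScript. This is a sketch for intuition only: `readStateKey` is a hypothetical helper, and `values` stands in for the stream's state object, not the library's internals:

```typescript
// Hypothetical helper mirroring useLangChainState's overloads:
// return the value stored under `key`, falling back to the default
// when the key is absent from the streamed state.
function readStateKey<T>(
  values: Record<string, unknown>,
  key: string,
  defaultValue?: T,
): T | undefined {
  const value = values[key] as T | undefined;
  return value !== undefined ? value : defaultValue;
}

// Example state as the stream might emit it.
const values = { todos: [{ id: "1", title: "Write docs", done: false }] };

// Present key: the stored list is returned, not the default.
const todos = readStateKey<{ id: string; title: string; done: boolean }[]>(
  values,
  "todos",
  [],
);

// Absent key: the provided default is returned instead.
const plan = readStateKey(values, "plan", "no plan yet");
```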
Added in v0.0.2; see issue #3862 for motivation.
Interrupts
LangGraph interrupts pause the graph and wait for client input. `useLangChainInterruptState` exposes the current interrupt; `useLangChainSubmit` resumes the graph with a raw state update.
```tsx
import {
  useLangChainInterruptState,
  useLangChainSubmit,
} from "@assistant-ui/react-langchain";
import { Command } from "@langchain/langgraph-sdk";

function InterruptPrompt() {
  const interrupt = useLangChainInterruptState();
  const submit = useLangChainSubmit();
  if (!interrupt) return null;
  return (
    <div>
      <pre>{JSON.stringify(interrupt.value, null, 2)}</pre>
      <button
        onClick={() =>
          submit(null, { command: new Command({ resume: "approved" }) })
        }
      >
        Approve
      </button>
    </div>
  );
}
```
Message conversion
`convertLangChainBaseMessage` transforms a LangChain `BaseMessage` into an assistant-ui message. Use it when building a custom `ExternalStoreAdapter` that needs to consume LangChain messages outside of `useStreamRuntime`.
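To give a feel for the kind of mapping such a conversion performs, here is a heavily simplified, self-contained sketch. The types and the `convertSimple` function below are illustrative assumptions, not the library's actual signatures or return shape:

```typescript
// Simplified, assumed shapes -- not the library's actual types.
type SimpleBaseMessage = { type: "human" | "ai"; content: string };
type SimpleThreadMessage = {
  role: "user" | "assistant";
  content: { type: "text"; text: string }[];
};

// Illustrative converter: maps a LangChain-style role and string
// content onto an assistant-ui-style message with content parts.
function convertSimple(message: SimpleBaseMessage): SimpleThreadMessage {
  return {
    role: message.type === "human" ? "user" : "assistant",
    content: [{ type: "text", text: message.content }],
  };
}

const converted = convertSimple({ type: "ai", content: "Hello!" });
```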
```ts
import { convertLangChainBaseMessage } from "@assistant-ui/react-langchain";
```
Cloud persistence
Pass an AssistantCloud instance to persist threads across sessions. The runtime automatically wires thread list management and resumes state from the cloud.
```tsx
import { AssistantCloud } from "assistant-cloud";
import { useStreamRuntime } from "@assistant-ui/react-langchain";

const cloud = new AssistantCloud({ baseUrl: "/api/cloud" });

const runtime = useStreamRuntime({
  cloud,
  assistantId: "agent",
  apiUrl: "http://localhost:2024",
});
```
Custom messagesKey
If your graph stores messages under a non-default key, pass `messagesKey` so the runtime submits tool results and human turns to the correct state slot:
```tsx
const runtime = useStreamRuntime({
  assistantId: "agent",
  apiUrl: "http://localhost:2024",
  messagesKey: "chat_messages",
});
```