Render MCP App UI resources inline in chat. Native renderer for the Model Context Protocol Apps spec — sandboxed iframes, JSON-RPC bridge, AI SDK integration.
MCP Apps lets a Model Context Protocol server ship a UI resource alongside a tool — a self-contained HTML widget that the chat host renders inline when the tool is called. assistant-ui ships a native renderer that mounts the widget in a sandboxed iframe via `SafeContentFrame` and runs a JSON-RPC `postMessage` bridge so the widget can call tools, send messages, request a display mode, and read host context.
## Overview
When an MCP server attaches a `_meta.ui.resourceUri` (with the `text/html;profile=mcp-app` MIME type) to a tool, AI SDK forwards that metadata through the message stream. assistant-ui's renderer picks it up off the `mcp` field on `ToolCallMessagePart`, fetches the resource through your backend route, and mounts it.
The renderer only acts on URIs that start with `ui://` (per the MCP Apps spec). Tools whose `resourceUri` uses any other scheme are treated as non-MCP-Apps tools and fall through to your regular tool UI.
The widget communicates back through a JSON-RPC bridge:
- widget → host requests: `ui/initialize`, `tools/call`, `resources/read`, `resources/list`, `openLink`, `sendMessage`, `requestDisplayMode`, `updateModelContext`
- host → widget notifications: tool input streaming, tool result, host context changes
- widget → host notifications: initialized, size changed, log, error, request teardown
Capability presence is determined at mount time by which handlers you provide. Unknown methods return JSON-RPC `-32601`; bad params return `-32602`.
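To make the error-code behavior concrete, here is a minimal sketch of a dispatcher that maps missing handlers to `-32601` and bad params to `-32602`. This is not the bridge's internal code — the handler shapes and the throw-to-`-32602` convention are illustrative assumptions:

```typescript
// Hypothetical sketch of the bridge's dispatch behavior.
// Handler names and shapes are illustrative, not the library's internal API.
type JsonRpcError = { code: number; message: string };
type JsonRpcResult = { result: unknown } | { error: JsonRpcError };
type Handler = (params: unknown) => unknown;

function dispatch(
  handlers: Record<string, Handler | undefined>,
  method: string,
  params: unknown,
): JsonRpcResult {
  const handler = handlers[method];
  if (!handler) {
    // No handler was provided at mount time → method not found.
    return { error: { code: -32601, message: `Method not found: ${method}` } };
  }
  try {
    return { result: handler(params) };
  } catch {
    // In this sketch, handlers signal invalid params by throwing.
    return { error: { code: -32602, message: "Invalid params" } };
  }
}

// Example: only openLink is wired up.
const handlers: Record<string, Handler | undefined> = {
  openLink: (params) => {
    const url = (params as { url?: string })?.url;
    if (!url || !/^https?:\/\//.test(url)) throw new Error("bad url");
    return { opened: true };
  },
};
```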
## Quick start
The renderer talks to a backend route you expose — the MCP client lives server-side so credentials and transport stay out of the browser. The route receives `{ method, params }` POSTs and dispatches to your MCP client.
### Client
Compose `McpAppRenderer({...})` into your `Tools` resource. Provide `host.url` pointing at your route. Any tool-call part carrying `mcp.app` metadata renders the MCP App widget automatically.
```tsx
import {
  useAui,
  Tools,
  McpAppRenderer,
  McpAppsRemoteHost,
} from "@assistant-ui/react";

function MyAssistant() {
  useAui({
    tools: Tools({
      toolkit: myToolkit,
      mcpApp: McpAppRenderer({
        host: McpAppsRemoteHost({ url: "/api/mcp-apps" }),
        hostInfo: { name: "my-app", version: "1.0.0" },
        hostContext: { theme: "light" },
      }),
    }),
  });
  // ...
}
```

`McpAppsRemoteHost` is the default host strategy — it POSTs `{ method, params }` to your route. A different strategy (e.g. a client-side MCP client) can be plugged in by writing a custom resource that returns the same `McpAppsHost` shape (`{ loadResource, callTool, readResource, listResources }`).
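A custom host strategy can be sketched like this. The method names and payloads mirror what the remote host sends to the route; the injected `transport` function and the exact result types are illustrative assumptions, not the library's API:

```typescript
// Sketch of a custom host with the { loadResource, callTool, readResource,
// listResources } shape. The transport is injected so it could be a fetch
// wrapper, a client-side MCP client, or (as in a test) a stub.
type Transport = (method: string, params: unknown) => Promise<unknown>;

function createCustomHost(transport: Transport) {
  return {
    loadResource: (uri: string) =>
      transport("mcp-apps/read-resource", { uri }),
    callTool: (name: string, args: Record<string, unknown>) =>
      transport("tools/call", { name, arguments: args }),
    readResource: (uri: string) => transport("resources/read", { uri }),
    listResources: (params?: unknown) =>
      transport("resources/list", params ?? {}),
  };
}
```

A fetch-based transport that POSTs `{ method, params }` to your route reproduces what `McpAppsRemoteHost` does for you.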
`openLink` is auto-wired to `window.open(url, "_blank", "noopener,noreferrer")`. `sendMessage` is auto-wired to append a user message to the current thread (accepts `string`, `{ prompt }`, `{ text }`, or `{ message }`).
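From inside the widget, `sendMessage` — like every widget → host request — travels as a JSON-RPC 2.0 envelope over `postMessage`. The envelope shape below is standard JSON-RPC; the surrounding helper names are illustrative, not a published widget SDK:

```typescript
// Illustrative widget-side helper: builds a JSON-RPC 2.0 request envelope.
// Only the envelope shape ({ jsonrpc, id, method, params }) is normative.
let nextId = 0;

function jsonRpcRequest(method: string, params: unknown) {
  return { jsonrpc: "2.0" as const, id: ++nextId, method, params };
}

function postToHost(method: string, params: unknown) {
  const msg = jsonRpcRequest(method, params);
  // In a real widget this targets the embedding host page:
  // window.parent.postMessage(msg, "*");
  return msg;
}

// The host accepts several sendMessage param shapes; an object with a
// prompt field is one of them, a bare string is another.
const msg = postToHost("sendMessage", { prompt: "Summarize this chart" });
```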
### Route handler
The route accepts POST requests with `{ method, params }` JSON bodies. Dispatch by method name and return the result as JSON. Example for Next.js App Router:
```ts
// app/api/mcp-apps/route.ts
import { experimental_createMCPClient } from "ai";

let clientPromise: ReturnType<typeof experimental_createMCPClient> | undefined;

const getClient = () => {
  clientPromise ??= experimental_createMCPClient({
    transport: { type: "sse", url: process.env.MCP_SERVER_URL! },
  });
  return clientPromise;
};

export async function POST(req: Request) {
  const { method, params } = await req.json();
  const client = await getClient();

  switch (method) {
    case "mcp-apps/read-resource": {
      const { contents } = await client.readResource({ uri: params.uri });
      const c = contents.find((x: { uri: string }) => x.uri === params.uri);
      return Response.json({
        uri: params.uri,
        mimeType: "text/html;profile=mcp-app",
        html: c?.text ?? "",
      });
    }
    case "tools/call": {
      const tools = await client.tools();
      const tool = tools[params.name];
      if (!tool?.execute) {
        return Response.json({ error: "Tool not callable" }, { status: 400 });
      }
      return Response.json(
        await tool.execute(params.arguments ?? {}, {
          toolCallId: `mcp-apps-bridge-${crypto.randomUUID()}`,
          messages: [],
        }),
      );
    }
    case "resources/read":
      return Response.json(await client.readResource({ uri: params.uri }));
    case "resources/list":
      return Response.json(await client.listResources(params));
    default:
      return Response.json({ error: "Unsupported method" }, { status: 400 });
  }
}
```

The renderer POSTs four method names: `mcp-apps/read-resource`, `tools/call`, `resources/read`, `resources/list`. Reject anything else server-side and apply your own auth / rate limiting in the route.
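A server-side guard for that allowlist can be as small as the following sketch, run before the `switch` (the constant name is an illustration, not part of the library):

```typescript
// Sketch: only the four renderer methods pass; everything else is
// rejected before it touches the MCP client. Combine with your own
// session check and per-tool allowlist.
const ALLOWED_METHODS = new Set([
  "mcp-apps/read-resource",
  "tools/call",
  "resources/read",
  "resources/list",
]);

function isAllowedMethod(method: unknown): boolean {
  return typeof method === "string" && ALLOWED_METHODS.has(method);
}
```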
Per-name `setToolUI` registrations always win over the MCP fallback — you can still customize specific tools.
## AI SDK integration
`@assistant-ui/react-ai-sdk` forwards `callProviderMetadata.mcp.app` from AI SDK tool UI parts into `ToolCallMessagePart.mcp.app`. With AI SDK 5.x and an MCP-Apps-capable MCP server, no extra wiring is required on the part shape.
On the chat route, use `splitMcpAppTools()` (from `@ai-sdk/mcp`) to keep app-only tools out of the model's view:
```ts
import { splitMcpAppTools } from "@ai-sdk/mcp";

const tools = await client.listTools();
const { modelVisible } = splitMcpAppTools(tools);

const result = streamText({
  model: openai("gpt-4o"),
  tools: modelVisible.tools,
  // ...
});
```

## Bridge protocol
The bridge implements the MCP UI JSON-RPC protocol over `window.postMessage`, filtered by both `event.source === frame.iframe.contentWindow` and `event.origin === frame.origin` — the cross-origin domain `SafeContentFrame` issues per render. Messages from any other origin or window are dropped silently.
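The double filter amounts to a two-clause predicate. A sketch of that check — the types below are structural stand-ins for `MessageEvent` and the frame handle, not the renderer's actual internals:

```typescript
// Sketch of the bridge's message filter: both the source window and the
// origin must match the frame, otherwise the message is dropped.
type FrameLike = { origin: string; contentWindow: object | null };
type MessageLike = { source: object | null; origin: string };

function acceptsMessage(frame: FrameLike, event: MessageLike): boolean {
  return (
    frame.contentWindow !== null &&
    event.source === frame.contentWindow && // same window object
    event.origin === frame.origin // same sandbox origin
  );
}
```

Checking origin alone is not enough: another frame on the same sandbox origin would pass, which is why the source-window identity check is also required.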
### Widget → host requests
| Method | Notes |
|---|---|
| `ui/initialize` | Returns `{ protocolVersion, host, hostContext, capabilities }`. Always supported. |
| `tools/call` | Routed to `host.url` with method `tools/call`. Optional `handlers.allowedTools` allowlist. Invalid arguments shape → `-32602`. |
| `resources/read` | Routed to `host.url` with method `resources/read`. |
| `resources/list` | Routed to `host.url` with method `resources/list`. |
| `openLink` | Requires `handlers.openLink`. Rejects non-http(s) URLs with `-32602`. |
| `sendMessage` | Requires `handlers.sendMessage`. |
| `requestDisplayMode` | Requires `handlers.requestDisplayMode`. Modes: `inline`, `fullscreen`, `pip`. |
| `updateModelContext` | Requires `handlers.updateModelContext`. |
When a handler isn't provided, the bridge returns JSON-RPC `-32601` (method not found); the same handler presence determines the `capabilities` object reported in the `ui/initialize` response.
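Deriving capabilities from handler presence can be sketched as a pure mapping. The capability field names here are illustrative — the renderer's actual `capabilities` shape may differ:

```typescript
// Sketch: capability advertisement is derived at mount time from which
// optional handlers exist. Field names are illustrative assumptions.
type Handlers = Partial<{
  openLink: unknown;
  sendMessage: unknown;
  requestDisplayMode: unknown;
  updateModelContext: unknown;
}>;

function deriveCapabilities(handlers: Handlers) {
  return {
    openLink: handlers.openLink != null,
    sendMessage: handlers.sendMessage != null,
    requestDisplayMode: handlers.requestDisplayMode != null,
    updateModelContext: handlers.updateModelContext != null,
  };
}
```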
### Host → widget notifications
- `notifications/tools/call/input` — sent whenever `part.args` (the streaming tool input) changes
- `notifications/tools/call/result` — sent when the tool result lands (including error envelopes)
- `notifications/host_context/changed` — sent when `hostContext` changes (e.g. the user toggles the theme)
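On the widget side these notifications can be consumed with a plain method dispatch. A sketch — the callback names are illustrative, not part of a published widget SDK:

```typescript
// Illustrative widget-side dispatcher for host → widget notifications.
type Notification = { method: string; params?: unknown };

function createNotificationHandler(on: {
  input?: (params: unknown) => void;
  result?: (params: unknown) => void;
  hostContext?: (params: unknown) => void;
}) {
  return (n: Notification): boolean => {
    switch (n.method) {
      case "notifications/tools/call/input":
        on.input?.(n.params);
        return true;
      case "notifications/tools/call/result":
        on.result?.(n.params);
        return true;
      case "notifications/host_context/changed":
        on.hostContext?.(n.params);
        return true;
      default:
        return false; // not a known host notification
    }
  };
}
```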
### Widget → host notifications
`notifications/initialized`, `notifications/size_changed`, `notifications/log`, `notifications/error`, `notifications/request_teardown` — wire them via `handlers.onInitialized`, `onSizeChange`, `onLog`, `onError`, and `onRequestTeardown` respectively.
If the widget never sends `notifications/initialized` (broken or non-spec-compliant), the host flushes its queued notifications after a 5-second safety timeout so the iframe doesn't appear hung.
## Sandboxing
The iframe is built with `SafeContentFrame`, which serves each widget from a content-hashed cross-origin domain so the host page is not reachable via same-origin references. Default sandbox flags are `allow-same-origin allow-scripts`. Tune via the `sandbox` field on `McpAppRendererOptions`:
```tsx
McpAppRenderer({
  // ...
  sandbox: {
    sandbox: ["allow-forms", "allow-popups"],
    enableBrowserCaching: true,
    className: "my-mcp-app",
  },
});
```

## Security notes
- Widgets run cross-origin in a sandboxed iframe. The bridge filters incoming messages by both source window and origin.
- The host route is your auth boundary — apply session checks, rate limiting, and per-tool allowlists there. The renderer trusts whatever the route returns.
- `openLink` rejects non-http(s) URLs at the bridge layer, but your `openLink` handler should still treat the URL as untrusted (e.g. always use `noopener,noreferrer`).
- Keep `host` and `handlers` references stable across renders (e.g. module-scope constants or `useMemo`); an unstable identity will tear down and refetch the widget on every parent re-render.