# Generative UI

URL: /docs/guides/tool-ui

Render tool calls as interactive UI instead of plain text.

import { ToolUISample } from "@/components/docs/samples/tool-ui-sample";

Create custom UI components for AI tool calls, providing visual feedback and interactive experiences when tools are executed.

## Overview

Tool UIs in assistant-ui allow you to create custom interfaces that appear when AI tools are called. These generative UI components enhance the user experience by:

* **Visualizing tool execution** with loading states and progress indicators
* **Displaying results** in rich, formatted layouts
* **Enabling user interaction** through forms and controls
* **Providing error feedback** with helpful recovery options

This guide demonstrates building tool UIs with the **Vercel AI SDK**.

## Creating Tool UIs

There are two main approaches to creating tool UIs in assistant-ui:

### 1. Client-Defined Tools (`makeAssistantTool`)

If you're creating tools on the client side, use `makeAssistantTool` to register them with the assistant context.
Then create a UI component with `makeAssistantToolUI`:

```tsx
import { makeAssistantTool, makeAssistantToolUI, tool } from "@assistant-ui/react";
import { z } from "zod";

// Define the tool
const weatherTool = tool({
  description: "Get current weather for a location",
  parameters: z.object({
    location: z.string(),
    unit: z.enum(["celsius", "fahrenheit"]),
  }),
  execute: async ({ location, unit }) => {
    const weather = await fetchWeatherAPI(location, unit);
    return weather;
  },
});

// Register the tool
const WeatherTool = makeAssistantTool({
  ...weatherTool,
  toolName: "getWeather",
});

// Create the UI
const WeatherToolUI = makeAssistantToolUI<
  { location: string; unit: "celsius" | "fahrenheit" },
  { temperature: number; description: string }
>({
  toolName: "getWeather",
  render: ({ args, result, status }) => {
    if (status.type === "running") {
      return <div>Checking weather in {args.location}...</div>;
    }

    return (
      <div>
        <h3>{args.location}</h3>
        <p>
          {result.temperature}°{args.unit === "celsius" ? "C" : "F"}
        </p>
        <p>{result.description}</p>
      </div>
    );
  },
});
```

Tools defined with `makeAssistantTool` can be passed to your backend using the `frontendTools` utility. Learn more about creating tools in the [Tools Guide](/docs/guides/tools).

### 2. UI-Only for Existing Tools (`makeAssistantToolUI`)

If your tool is defined elsewhere (e.g., in your backend API, MCP server, or LangGraph), use `makeAssistantToolUI` to create just the UI component:

```tsx
import { makeAssistantToolUI } from "@assistant-ui/react";

const WeatherToolUI = makeAssistantToolUI<
  { location: string; unit: "celsius" | "fahrenheit" },
  { temperature: number; description: string }
>({
  toolName: "getWeather", // Must match the backend tool name
  render: ({ args, result, status }) => {
    // UI rendering logic only
  },
});
```

## Quick Start Example

This example shows how to implement the UI-only approach using `makeAssistantToolUI`:

### Create a Tool UI Component

```tsx
import { makeAssistantToolUI } from "@assistant-ui/react";

type WeatherArgs = {
  location: string;
  unit: "celsius" | "fahrenheit";
};

type WeatherResult = {
  temperature: number;
  description: string;
  humidity: number;
  windSpeed: number;
};

const WeatherToolUI = makeAssistantToolUI<WeatherArgs, WeatherResult>({
  toolName: "getWeather",
  render: ({ args, status, result }) => {
    if (status.type === "running") {
      return (
        <div>Checking weather in {args.location}...</div>
      );
    }

    if (status.type === "incomplete" && status.reason === "error") {
      return (
        <div>Failed to get weather for {args.location}</div>
      );
    }

    return (
      <div>
        <h3>{args.location}</h3>
        <p>
          {result.temperature}°{args.unit === "celsius" ? "C" : "F"}
        </p>
        <p>{result.description}</p>
        <p>Humidity: {result.humidity}%</p>
        <p>Wind: {result.windSpeed} km/h</p>
      </div>
    );
  },
});
```
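Because `render` is a plain function of its props, the branching on `status` can be unit tested without mounting React. A minimal sketch of that idea; the `ToolStatus` shape and `describeWeatherStatus` helper below are simplified stand-ins invented for illustration, not the library's exported types:

```typescript
// Simplified stand-in for the status union the render prop receives
// (assumption: modeled on the states shown in this guide).
type ToolStatus =
  | { type: "running" }
  | { type: "requires-action" }
  | { type: "incomplete"; reason: "error" | "cancelled" }
  | { type: "complete" };

type WeatherArgs = { location: string; unit: "celsius" | "fahrenheit" };

// Pure helper mirroring the render branches above, handy for testing
// the copy shown in each state.
function describeWeatherStatus(status: ToolStatus, args: WeatherArgs): string {
  if (status.type === "running") {
    return `Checking weather in ${args.location}...`;
  }
  if (status.type === "incomplete" && status.reason === "error") {
    return `Failed to get weather for ${args.location}`;
  }
  return args.location;
}
```

Keeping the per-state copy in a pure function like this makes the loading and error branches trivial to assert on, while the component stays a thin wrapper.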
### Register the Tool UI

Place the component inside your `AssistantRuntimeProvider`:

```tsx
function App() {
  return (
    <AssistantRuntimeProvider runtime={runtime}>
      <Thread />
      <WeatherToolUI />
    </AssistantRuntimeProvider>
  );
}
```

### Define the Backend Tool (Vercel AI SDK)

When using the Vercel AI SDK, define the corresponding tool in your API route:

```tsx title="/app/api/chat/route.ts"
import { openai } from "@ai-sdk/openai";
import { convertToModelMessages, streamText, tool } from "ai";
import { z } from "zod";

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai("gpt-4o"),
    messages: convertToModelMessages(messages),
    tools: {
      getWeather: tool({
        description: "Get current weather for a location",
        inputSchema: z.object({
          location: z.string(),
          unit: z.enum(["celsius", "fahrenheit"]),
        }),
        execute: async ({ location, unit }) => {
          const weather = await fetchWeatherAPI(location, unit);
          return {
            temperature: weather.temp,
            description: weather.condition,
            humidity: weather.humidity,
            windSpeed: weather.wind,
          };
        },
      }),
    },
  });

  return result.toUIMessageStreamResponse();
}
```
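The `frontendTools` utility mentioned earlier forwards client-defined tool schemas to the backend so the model can call them while execution stays on the client. Conceptually, it passes along each tool's metadata without its `execute` function. The sketch below is a hypothetical illustration of that idea, not the real implementation:

```typescript
// Hypothetical sketch: forward client tool definitions as schema-only
// entries so the model sees them but execution remains on the client.
// The real `frontendTools` utility handles this for you.
type ClientToolDef = {
  description: string;
  parameters: unknown; // a JSON/zod schema in practice
  execute?: (args: unknown) => unknown;
};

function sketchFrontendTools(
  tools: Record<string, ClientToolDef>,
): Record<string, { description: string; parameters: unknown }> {
  const forwarded: Record<string, { description: string; parameters: unknown }> = {};
  for (const [name, { description, parameters }] of Object.entries(tools)) {
    forwarded[name] = { description, parameters }; // execute is deliberately dropped
  }
  return forwarded;
}
```

This is why the backend can still stream tool-call events for client tools: it only needs the schema to let the model emit a call, while the result is produced in the browser.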
## Tool UI Patterns

### Component Pattern

Create standalone tool UI components:

```tsx
export const WebSearchToolUI = makeAssistantToolUI<
  { query: string },
  { results: SearchResult[] }
>({
  toolName: "webSearch",
  render: ({ args, status, result }) => {
    return (
      <div>
        <p>Search results for: "{args.query}"</p>
        {status.type === "running" && <LoadingIndicator />}
        {result && (
          <ul>
            {result.results.map((item, index) => (
              <li key={index}>
                <a href={item.url}>{item.title}</a>
                <p>{item.snippet}</p>
              </li>
            ))}
          </ul>
        )}
      </div>
    );
  },
});
```

### Hook Pattern

Use hooks for dynamic tool UI registration. The `useAssistantToolUI` hook registers a tool UI from inside a component, which lets the render function access local props and state:

```tsx
import { useAssistantToolUI } from "@assistant-ui/react";

function DynamicToolUI() {
  const [theme, setTheme] = useState("light");

  useAssistantToolUI({
    toolName: "analyzeData",
    render: ({ args, result, status }) => {
      // Hook allows access to component state
      return <AnalysisChart theme={theme} data={result} />;
    },
  });

  return null;
}
```

### Inline Pattern

For tools that need access to parent component props:

**Why `useInlineRender`?** By default, a tool UI's `render` function is static. Use `useInlineRender` when your UI needs access to dynamic component props (for example, to pass in an `id` or other contextual data).

```tsx
import { useAssistantToolUI, useInlineRender } from "@assistant-ui/react";

function ProductPage({ productId, productName }) {
  useAssistantToolUI({
    toolName: "checkInventory",
    render: useInlineRender(({ args, result }) => {
      // Access parent component props
      return (

        <div>
          <h4>{productName} Inventory</h4>
          <p>Stock for {productId}: {result.quantity} units</p>
          <p>Location: {result.warehouse}</p>
        </div>
      );
    }),
  });

  return <div>Product details...</div>;
}
```

## Interactive Tool UIs

### User Input Collection

Create tools that collect user input during execution:

**Pro tip:** Call `addResult(...)` exactly once to complete the tool call. After it's invoked, the assistant will resume the conversation with your provided data.

```tsx
const DatePickerToolUI = makeAssistantToolUI<
  { prompt: string },
  { date: string }
>({
  toolName: "selectDate",
  render: ({ args, result, addResult }) => {
    if (result) {
      return (
        <div>✅ Selected date: {new Date(result.date).toLocaleDateString()}</div>
      );
    }

    return (
      <div>
        <p>{args.prompt}</p>
        <DatePicker
          onSelect={(date) => {
            addResult({ date: date.toISOString() });
          }}
        />
      </div>
    );
  },
});
```

### Multi-Step Interactions

Build complex workflows with human-in-the-loop patterns for multi-step user interactions:

```tsx
const DeleteProjectTool = makeAssistantTool({
  toolName: "deleteProject",
  execute: async ({ projectId }, { human }) => {
    const response = await human({
      action: "deleteProject",
      details: { projectId },
    });
    if (!response.approved) throw new Error("Project deletion cancelled");
    await deleteProject(projectId);
    return { success: true };
  },
});

const ApprovalTool = makeAssistantTool({
  ...tool({
    description: "Request user approval for an action",
    parameters: z.object({
      action: z.string(),
      details: z.any(),
    }),
    execute: async ({ action, details }, { human }) => {
      // Request approval from user
      const response = await human({ action, details });
      return {
        approved: response.approved,
        reason: response.reason,
      };
    },
  }),
  toolName: "requestApproval",
  render: ({ args, result, interrupt, resume }) => {
    const [reason, setReason] = useState("");

    // Show result after approval/rejection
    if (result) {
      return (
        <div>
          {result.approved ? "✅ Approved" : `❌ Rejected: ${result.reason}`}
        </div>
      );
    }

    // Show approval UI when waiting for user input
    if (interrupt) {
      return (
        <div>
          <h4>Approval Required</h4>
          <p>{interrupt.payload.action}</p>
          <pre>
            {JSON.stringify(interrupt.payload.details, null, 2)}
          </pre>
          <div className="flex gap-2">
            <input
              value={reason}
              onChange={(e) => setReason(e.target.value)}
              className="flex-1 rounded border px-2"
            />
            <button onClick={() => resume({ approved: true, reason })}>
              Approve
            </button>
            <button onClick={() => resume({ approved: false, reason })}>
              Reject
            </button>
          </div>
        </div>
      );
    }

    return <div>Processing...</div>;
  },
});
```

Use tool human input (`human()` / `resume()`) for workflows that need to pause tool execution and wait for user input. Use `addResult()` for "human tools" where the AI requests a tool call but the entire execution happens through user interaction.

## Advanced Features

### Tool Status Handling

The `status` prop provides detailed execution state:

```tsx
render: ({ status, args }) => {
  switch (status.type) {
    case "running":
      return <LoadingSpinner />;
    case "requires-action":
      return <ActionPrompt />;
    case "incomplete":
      if (status.reason === "cancelled") {
        return <div>Operation cancelled</div>;
      }
      if (status.reason === "error") {
        return <ErrorDisplay />;
      }
      return <div>Failed: {status.reason}</div>;
    case "complete":
      return <ResultView />;
  }
};
```

### Field-Level Validation

Use `useToolArgsFieldStatus` to show validation states:

```tsx
import { useToolArgsFieldStatus } from "@assistant-ui/react";

const FormToolUI = makeAssistantToolUI({
  toolName: "submitForm",
  render: ({ args }) => {
    const emailStatus = useToolArgsFieldStatus("email");
    const phoneStatus = useToolArgsFieldStatus("phone");

    return (
      <form>
        <label>
          Email
          <input name="email" />
          {emailStatus.type === "incomplete" && (
            <span>Invalid email</span>
          )}
        </label>
        <label>
          Phone
          <input name="phone" />
          {phoneStatus.type === "incomplete" && (
            <span>Invalid phone</span>
          )}
        </label>
      </form>
    );
  },
});
```

### Partial Results & Streaming

Display results as they stream in:

```tsx
const AnalysisToolUI = makeAssistantToolUI<
  { data: string },
  { progress: number; insights: string[] }
>({
  toolName: "analyzeData",
  render: ({ result, status }) => {
    const progress = result?.progress || 0;
    const insights = result?.insights || [];

    return (
      <div>
        {status.type === "running" && (
          <div>
            <progress value={progress} max={100} />
            <span>Analyzing... {progress}%</span>
          </div>
        )}
        <ul>
          {insights.map((insight, i) => (
            <li key={i}>{insight}</li>
          ))}
        </ul>
      </div>
    );
  },
});
```

### Custom Tool Fallback

Provide a custom UI for tools without specific UIs:

```tsx
<Thread
  tools={{
    Fallback: ({ toolName, args, result }) => (
      <div className="font-mono text-sm">
        {toolName}({JSON.stringify(args)})
        {result && (
          <pre>{JSON.stringify(result, null, 2)}</pre>
        )}
      </div>
    ),
  }}
/>
```

## Execution Context

Generative UI components have access to execution context through props:

```tsx
type ToolUIRenderProps = {
  // Tool arguments
  args: TArgs;
  argsText: string; // JSON stringified args

  // Execution status
  status: ToolCallMessagePartStatus;
  isError?: boolean;

  // Tool result (may be partial during streaming)
  result?: TResult;

  // Tool metadata
  toolName: string;
  toolCallId: string;

  // Interactive callbacks
  addResult: (result: TResult) => void;
  resume: (payload: unknown) => void;

  // Interrupt state
  interrupt?: { type: "human"; payload: unknown }; // Payload from context.human()

  // Optional artifact data
  artifact?: unknown;
};
```

### Human Input Handling

When a tool calls `human()` during execution, the payload becomes available in the render function as `interrupt.payload`:

```tsx
const ConfirmationToolUI = makeAssistantToolUI<
  { action: string },
  { confirmed: boolean }
>({
  toolName: "confirmAction",
  render: ({ args, result, interrupt, resume }) => {
    // Tool is waiting for user input
    if (interrupt) {
      return (

        <div>
          <p>Confirm: {interrupt.payload.message}</p>
          <button onClick={() => resume({ confirmed: true })}>Confirm</button>
          <button onClick={() => resume({ confirmed: false })}>Cancel</button>
        </div>
      );
    }

    // Tool completed
    if (result) {
      return <div>Action {result.confirmed ? "confirmed" : "cancelled"}</div>;
    }

    return <div>Processing...</div>;
  },
});
```

Learn more about tool human input in the [Tools Guide](/docs/guides/tools#tool-human-input).

## Best Practices

### 1. Handle All Status States

Always handle loading, error, and success states:

```tsx
render: ({ status, result, args }) => {
  if (status.type === "running") return <LoadingSpinner />;
  if (status.type === "incomplete") return <ErrorMessage />;
  if (!result) return null;
  return <ResultView result={result} />;
};
```

### 2. Provide Visual Feedback

Use animations and transitions for better UX:

```tsx
<div className="animate-in fade-in duration-300">
  {/* Tool UI content */}
</div>
```

### 3. Make UIs Accessible

Ensure keyboard navigation and screen reader support:

```tsx
<button
  onClick={() => addResult({ confirmed: true })}
  aria-label="Confirm the requested action"
>
  Confirm
</button>
```

### 4. Optimize Performance

Use `useInlineRender` to prevent unnecessary re-renders:

```tsx
useAssistantToolUI({
  toolName: "heavyComputation",
  render: useInlineRender(({ result }) => {
    // Expensive rendering logic
    return <HeavyChart data={result} />;
  }),
});
```

Generative UI components are only displayed in the chat interface. The actual tool execution happens on the backend. This separation allows you to create rich, interactive experiences while keeping sensitive logic secure on the server.

## Related Guides

* [Tools Guide](/docs/guides/tools) - Learn how to create and use tools with AI models
* [Tool Fallback](/docs/ui/tool-fallback) - Default UI for tools without custom components
* [API Reference](/docs/api-reference/primitives/message-part) - Detailed type definitions and component APIs
* [Message Primitive](/docs/api-reference/primitives/message) - Complete Message component documentation