Wire up the Mastra TypeScript agent framework with assistant-ui.
Mastra is an open-source TypeScript agent framework. It provides primitives for AI applications: agents with memory and tool calling, deterministic LLM workflows, RAG, model routing, workflow graphs, and automated evals.
This is an integration guide rather than a dedicated runtime adapter: assistant-ui does not ship a @assistant-ui/react-mastra package. Instead, you wire up Mastra through the standard AI SDK runtime by routing your chat API endpoint through a Mastra agent's stream.
Pick a pattern
| Pattern | When to pick |
|---|---|
| Full-stack | One Next.js app: API routes call Mastra in-process. Simpler deployment, single repo. |
| Separate server | Mastra runs as its own service; the Next.js frontend hits its API. Independent scaling, clearer separation of concerns. |
Both patterns use the same client-side useChatRuntime hook from @assistant-ui/react-ai-sdk; the only difference is where the Mastra agent lives.
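For reference, the client side might look like the following minimal sketch. It assumes a /api/chat route (the full-stack pattern) and a Thread component generated by the assistant-ui CLI; for the separate-server pattern, point the api option at your Mastra server's agent stream endpoint instead. Option names may vary slightly across @assistant-ui/react-ai-sdk versions.

```typescript
// app/page.tsx — client-side wiring (identical for both patterns)
"use client";

import { AssistantRuntimeProvider } from "@assistant-ui/react";
import { useChatRuntime } from "@assistant-ui/react-ai-sdk";
import { Thread } from "@/components/assistant-ui/thread";

export default function ChatPage() {
  // "/api/chat" is an assumed route name; any endpoint that speaks the
  // AI SDK data stream protocol works here.
  const runtime = useChatRuntime({ api: "/api/chat" });

  return (
    <AssistantRuntimeProvider runtime={runtime}>
      <Thread />
    </AssistantRuntimeProvider>
  );
}
```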
Architecture
Mastra integrates at the LLM-client layer on the server. assistant-ui talks to your API route via the AI SDK runtime; the route calls agent.stream(messages) and returns the result wrapped in a UI message stream. The client side is built on ExternalStoreRuntime through the AI SDK adapter.
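A full-stack route handler might look like this sketch. It assumes a Mastra instance exported from @/mastra with an agent registered under the name chatAgent (both names are placeholders), and a Mastra version whose stream result exposes toDataStreamResponse:

```typescript
// app/api/chat/route.ts — the API route assistant-ui talks to
import { mastra } from "@/mastra"; // assumed: your Mastra instance

export async function POST(req: Request) {
  const { messages } = await req.json();

  // "chatAgent" is a placeholder; use whatever name you registered.
  const agent = mastra.getAgent("chatAgent");

  // Stream the agent's response and wrap it in the AI SDK data stream
  // format, which useChatRuntime consumes on the client.
  const stream = await agent.stream(messages);
  return stream.toDataStreamResponse();
}
```

In the separate-server pattern, the same stream is exposed by the standalone Mastra server instead, and the frontend's endpoint points there.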
Shared adapters (attachments, speech, feedback, history) work the same way as described in the adapters guide. Multi-thread support uses AssistantCloud or a custom thread list.
Requirements
- A Next.js project, or another framework that can run AI SDK route handlers.
- Model API keys (OpenAI, Anthropic, etc.) configured in your environment.
- Node.js 20 or newer (Mastra's documented minimum).
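For example, with OpenAI as the provider, the environment configuration might look like this (key values are placeholders):

```shell
# .env.local — set only the keys for the providers you actually use
OPENAI_API_KEY=sk-...
# ANTHROPIC_API_KEY=sk-ant-...
```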