Separate Server Integration
Run Mastra as a standalone server and connect your Next.js frontend (using assistant-ui) to its API endpoints. This approach separates your AI backend from your frontend application, allowing for independent development and scaling.
Create Mastra Server Project
First, create a dedicated project for your Mastra server. Choose a directory separate from your Next.js/assistant-ui frontend project.
Navigate to your chosen parent directory in the terminal and run the Mastra create command:
npx create-mastra@latest

This command will launch an interactive wizard to help you scaffold a new Mastra project, including prompting you for a project name and setting up basic configurations. Follow the prompts to create your server project. For more detailed setup instructions, refer to the official Mastra installation guide.
Once the setup is complete, navigate into your new Mastra project directory (the name you provided during the setup):
cd your-mastra-server-directory # Replace with the actual directory name

You now have a basic Mastra server project ready.
API Keys
Ensure you have configured your environment variables (e.g., OPENAI_API_KEY)
within this Mastra server project, typically in a .env.development file, as
required by the models you use. The create-mastra wizard might prompt you
for some keys, but ensure all necessary keys for your chosen models are
present.
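For reference, the only key the chefAgent defined below needs is an OpenAI key. A minimal .env.development would contain a single line (the value shown is a placeholder):

OPENAI_API_KEY=your-openai-api-key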
Define the Agent
Next, let's define an agent within your Mastra server project. We'll create a chefAgent similar to the one used in the full-stack guide.
Open or create the agent file (e.g., src/agents/chefAgent.ts within your Mastra project) and add the following code:
import { openai } from "@ai-sdk/openai";
import { Agent } from "@mastra/core/agent";

export const chefAgent = new Agent({
  name: "chef-agent",
  instructions:
    "You are Michel, a practical and experienced home chef. " +
    "You help people cook with whatever ingredients they have available.",
  model: openai("gpt-4o-mini"),
});

This defines the agent's behavior, but it's not yet active in the Mastra server.
Register the Agent
Now, you need to register the chefAgent with your Mastra instance so the server knows about it. Open your main Mastra configuration file (this is often src/index.ts in projects created with create-mastra).
Import the chefAgent and add it to the agents object when initializing Mastra:
import { Mastra } from "@mastra/core";
import { chefAgent } from "./agents/chefAgent"; // Adjust path if necessary

export const mastra = new Mastra({
  agents: { chefAgent },
});

Make sure you adapt this code to fit the existing structure of your src/index.ts file generated by create-mastra. The key is to import your agent and include it in the agents configuration object.
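For illustration only: if the generated src/index.ts already registers another agent (create-mastra scaffolds typically include one; the weatherAgent name and its import path below are assumptions), the adapted file might look roughly like this:

import { Mastra } from "@mastra/core";
import { chefAgent } from "./agents/chefAgent";
// Hypothetical: an agent the create-mastra scaffold may have registered already.
import { weatherAgent } from "./agents/weather-agent";

export const mastra = new Mastra({
  // Keep any existing agents and add chefAgent alongside them.
  agents: { weatherAgent, chefAgent },
});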
Run the Mastra Server
With the agent defined and registered, start the Mastra development server:
npm run dev

By default, the Mastra server will run on http://localhost:4111. Your chefAgent should now be accessible via a POST request endpoint, typically http://localhost:4111/api/agents/chefAgent/stream. Keep this server running for the next steps, where we'll set up the assistant-ui frontend to connect to it.
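If you'd like to confirm the endpoint responds before wiring up the frontend, you can hit it directly from a small script. The sketch below assumes the stream endpoint accepts a JSON body with a messages array; the exact request and response format may vary between Mastra versions, so treat it as a rough smoke test rather than a specification. With the server running, you could execute it with npx tsx check-agent.ts:

// check-agent.ts — a hypothetical smoke test against the local Mastra server.
const res = await fetch("http://localhost:4111/api/agents/chefAgent/stream", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    // Assumed request shape: a list of chat messages for the agent.
    messages: [{ role: "user", content: "I have eggs and spinach. What can I cook?" }],
  }),
});

if (!res.ok || !res.body) {
  throw new Error(`Request failed: ${res.status} ${res.statusText}`);
}

// Print the raw streamed chunks as they arrive.
const reader = res.body.getReader();
const decoder = new TextDecoder();
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  process.stdout.write(decoder.decode(value, { stream: true }));
}

If you see streamed output describing a recipe, the server side is working.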
Initialize assistant-ui Frontend
Now, set up your frontend application using assistant-ui. Navigate to a different directory from your Mastra server project. You can either create a new Next.js project or use an existing one.
Inside your frontend project directory, run one of the following commands:
npx assistant-ui@latest create

npx assistant-ui@latest init

Use create to scaffold a new project or init to add assistant-ui to an existing Next.js project. Either command installs the necessary assistant-ui dependencies and sets up basic configuration files, including a default chat page and an API route (app/api/chat/route.ts).
Need Help?
For detailed setup instructions for assistant-ui, including manual setup steps, please refer to the main Getting Started guide.
In the next step, we will configure this frontend to communicate with the separate Mastra server instead of using the default API route.
Configure Frontend API Endpoint
The default assistant-ui setup configures the chat runtime to use a local API route (/api/chat) within the Next.js project. Since our Mastra agent is running on a separate server, we need to update the frontend to point to that server's endpoint.
Open the main page file in your assistant-ui frontend project (usually app/page.tsx or src/app/page.tsx). Find the useChatRuntime hook and change the api property to the full URL of your Mastra agent's stream endpoint:
"use client";
import { Thread } from "@/components/assistant-ui/thread";
import {
useChatRuntime,
AssistantChatTransport,
} from "@assistant-ui/react-ai-sdk";
import { AssistantRuntimeProvider } from "@assistant-ui/react";
import { ThreadList } from "@/components/assistant-ui/thread-list";
export default function Home() {
// Point the runtime to the Mastra server endpoint
const runtime = useChatRuntime({
transport: new AssistantChatTransport({
api: "MASTRA_ENDPOINT",
}),
});
return (
<AssistantRuntimeProvider runtime={runtime}>
<main className="grid h-dvh grid-cols-[200px_1fr] gap-x-2 px-4 py-4">
<ThreadList />
<Thread />
</main>
</AssistantRuntimeProvider>
);
}Replace "http://localhost:4111/api/agents/chefAgent/stream" with the actual URL if your Mastra server runs on a different port or host, or if your agent has a different name.
Now, the assistant-ui frontend will send chat requests directly to your running Mastra server.
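If you prefer not to hardcode the URL, one option is to read it from an environment variable. NEXT_PUBLIC_MASTRA_ENDPOINT below is a name invented for this example (set it yourself in the frontend's .env.local); Next.js inlines NEXT_PUBLIC_-prefixed variables into client components at build time. This sketch would replace the useChatRuntime call in the page above:

// Hypothetical: fall back to the local Mastra endpoint when the variable is unset.
const runtime = useChatRuntime({
  transport: new AssistantChatTransport({
    api:
      process.env.NEXT_PUBLIC_MASTRA_ENDPOINT ??
      "http://localhost:4111/api/agents/chefAgent/stream",
  }),
});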
Delete Default API Route
Since the frontend no longer uses the local /api/chat route created by the
init command, you can safely delete the app/api/chat/route.ts (or
src/app/api/chat/route.ts) file from your frontend project.
Run the Frontend Application
You're ready to connect the pieces! Make sure your separate Mastra server is still running (from the Run the Mastra Server step above).
In your assistant-ui frontend project directory, start the Next.js development server:
npm run dev

Open your browser to http://localhost:3000 (or the port specified in your terminal for the frontend app). You should now be able to interact with your chefAgent through the assistant-ui chat interface. The frontend will make requests to your Mastra server running on http://localhost:4111.
Congratulations! You have successfully integrated Mastra with assistant-ui using a separate server approach. Your assistant-ui frontend now communicates with a standalone Mastra agent server.
This setup provides a clear separation between your frontend and AI backend. To explore more advanced Mastra features like memory, tools, workflows, and deployment options, please refer to the official Mastra documentation.