# Part 1: Setup frontend
URL: /docs/runtimes/langgraph/tutorial/part-1
Create a Next.js project with the LangGraph assistant-ui template.
## Create a new project [#create-a-new-project]
Run the following command to create a new Next.js project with the LangGraph assistant-ui template:
```sh
npx create-assistant-ui@latest -t langgraph my-app
cd my-app
```
## Setup environment variables [#setup-environment-variables]
Create a `.env.local` file in your project with the following variables:
```sh title="@/.env.local"
LANGGRAPH_API_URL=https://assistant-ui-stockbroker.vercel.app/api
NEXT_PUBLIC_LANGGRAPH_ASSISTANT_ID=stockbroker
```
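As a rough illustration of how these two variables behave (the actual wiring lives inside the template's runtime setup; the helper below is hypothetical):

```typescript
// Illustrative only: the template reads these for you. The point is the
// difference between the two prefixes:
// - LANGGRAPH_API_URL is server-only.
// - NEXT_PUBLIC_* variables are inlined into the browser bundle by Next.js.
const apiUrl =
  process.env["LANGGRAPH_API_URL"] ?? "https://assistant-ui-stockbroker.vercel.app/api";
const assistantId = process.env["NEXT_PUBLIC_LANGGRAPH_ASSISTANT_ID"] ?? "stockbroker";

// Hypothetical helper: resolve an endpoint path against the API base URL.
export function endpointUrl(path: string, base: string = apiUrl): string {
  return new URL(path, base.endsWith("/") ? base : base + "/").toString();
}
```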
This connects the frontend to a LangGraph Cloud endpoint hosted at `https://assistant-ui-stockbroker.vercel.app/api`. The endpoint runs the LangGraph agent defined [in this repository](https://github.com/assistant-ui/assistant-ui-stockbroker/blob/main/backend).
## Start the server [#start-the-server]
You can start the server by running the following command:
```sh
npm run dev
```
The server will start and you can view the frontend by opening a browser tab to [http://localhost:3000](http://localhost:3000).
You should be able to chat with the assistant and see LLM responses streaming in real-time.
## Explore features [#explore-features]
### Streaming [#streaming]
Streaming message support is enabled by default. The LangGraph integration includes sophisticated message handling that efficiently manages streaming responses:
* Messages are accumulated and updated in real-time using `LangGraphMessageAccumulator`
* Partial message chunks are automatically merged using `appendLangChainChunk`
* The runtime handles all the complexity of managing streaming state
This means you'll see tokens appear smoothly as they're generated by the LLM, with proper handling of both text content and tool calls.
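Conceptually, the accumulator merges partial chunks that share a message id into one growing message. A simplified stand-in (not the library's actual `LangGraphMessageAccumulator` API) might look like:

```typescript
type Chunk = { id: string; content: string };

// Simplified sketch of chunk accumulation: chunks arriving with the same
// message id are concatenated in order into a single message. The real
// accumulator additionally merges tool-call chunks and structured content.
export function accumulate(chunks: Chunk[]): Map<string, string> {
  const messages = new Map<string, string>();
  for (const { id, content } of chunks) {
    messages.set(id, (messages.get(id) ?? "") + content);
  }
  return messages;
}
```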
### Markdown support [#markdown-support]
Rich text rendering using Markdown is enabled by default.
## Add conversation starter messages [#add-conversation-starter-messages]
To help users understand what the assistant can do, add a few conversation starter suggestions that appear on the welcome screen.
```tsx title="@/app/page.tsx"
"use client";

import { Thread } from "@assistant-ui/react"; // styled Thread; adjust to your template's component

export default function Home() {
  return (
    <Thread welcome={{ suggestions: [{ prompt: "What is the current stock price of AAPL?" }] }} />
  );
}
```