# OpenUI — The Open Standard for Generative UI
OpenUI is a full-stack Generative UI framework consisting of a compact streaming-first language (OpenUI Lang), a React runtime with built-in component libraries, and ready-to-use chat interfaces. It enables AI models to generate structured, interactive UI components instead of plain text or JSON, achieving up to 67% fewer tokens than equivalent JSON-based approaches. The framework covers the entire pipeline: defining a component library with Zod schemas, generating a system prompt from that library, streaming OpenUI Lang output from an LLM, and progressively rendering parsed components in React as tokens arrive.
The SDK is organized into four packages: `@openuidev/react-lang` (core runtime — component definitions, parser, renderer, and prompt generation), `@openuidev/react-headless` (headless chat state, streaming adapters, and message format converters), `@openuidev/react-ui` (prebuilt chat layouts and two ready-to-use component libraries), and `@openuidev/cli` (CLI for scaffolding apps and generating system prompts). A fifth package, `@openuidev/lang-core`, provides a framework-agnostic `generatePrompt` function for server-side prompt construction without React dependencies.
---
## `@openuidev/cli` — Scaffold and generate
### `openui create` — scaffold a new Next.js chat app
Creates a Next.js app pre-configured with OpenUI Chat, auto-detects the package manager, installs dependencies, and optionally installs the OpenUI agent skill.
```bash
# Interactive — prompts for project name and agent skill
npx @openuidev/cli@latest create
# Non-interactive
npx @openuidev/cli@latest create --name my-app --no-skill
# With agent skill pre-installed for Claude Code / Cursor / Copilot
npx @openuidev/cli@latest create --name my-app --skill
# The generated app is immediately runnable
cd my-app
echo "OPENAI_API_KEY=sk-your-key-here" > .env
npm run dev # http://localhost:3000
```
### `openui generate` — generate a system prompt or JSON schema from a library file
Bundles the entry file with esbuild, evaluates the exported `Library`, and writes the system prompt (or JSON schema) to stdout or a file. Auto-detects the library export and any `PromptOptions` export.
```bash
# Print system prompt to stdout
npx @openuidev/cli@latest generate ./src/library.ts
# Write system prompt to file (typical prebuild step)
npx @openuidev/cli@latest generate ./src/library.ts --out src/generated/system-prompt.txt
# Output JSON schema for use with generatePrompt() at runtime
npx @openuidev/cli@latest generate ./src/library.ts --json-schema --out generated/component-spec.json
# Explicit export names when auto-detection is ambiguous
npx @openuidev/cli@latest generate ./src/library.ts --export myLibrary --prompt-options myOptions
```
Add it as a prebuild step so the prompt stays in sync with the component library:
```json
{
  "scripts": {
    "generate:prompt": "openui generate src/library.ts --out src/generated/system-prompt.txt",
    "dev": "pnpm generate:prompt && next dev",
    "build": "pnpm generate:prompt && next build"
  }
}
```
---
## `@openuidev/react-lang` — Core runtime
### `defineComponent(config)` — register a UI component
Defines a single component with a name, Zod schema, natural-language description, and React renderer. Key order in the `z.object(...)` schema determines positional argument order in OpenUI Lang output. The returned `DefinedComponent` exposes a `.ref` for use in parent schemas.
```tsx
import { defineComponent, createLibrary } from "@openuidev/react-lang";
import { z } from "zod/v4";
// Leaf component
const StatCard = defineComponent({
  name: "StatCard",
  description: "Displays a metric label and its current value.",
  props: z.object({
    label: z.string(),
    value: z.string(),
    trend: z.enum(["up", "down", "neutral"]).optional(),
  }),
  component: ({ props }) => (
    <div>
      <span>{props.label}</span>
      <strong>{props.value}</strong>
    </div>
  ),
});
// Container that nests children via .ref
const Dashboard = defineComponent({
  name: "Dashboard",
  description: "Full-page dashboard container.",
  props: z.object({
    title: z.string(),
    cards: z.array(StatCard.ref),
  }),
  component: ({ props, renderNode }) => (
    <section>
      <h1>{props.title}</h1>
      {renderNode(props.cards)}
    </section>
  ),
});
```
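Since key order in the `z.object(...)` schema drives positional arguments, the `StatCard` above has the call shape `StatCard(label, value, trend?)` in OpenUI Lang. A minimal sketch of that mapping (the helper name and trailing-optional handling are illustrative, not library internals):

```typescript
// Map a props object to a positional OpenUI Lang call, following schema key order.
// Trailing omitted optionals are dropped; interior values must be present.
function toPositionalCall(
  name: string,
  keyOrder: string[],
  props: Record<string, unknown>,
): string {
  const args = keyOrder
    .map((k) => props[k])
    // keep a value if anything from this position onward is defined
    .filter((v, i, arr) => arr.slice(i).some((x) => x !== undefined))
    .map((v) => JSON.stringify(v));
  return `${name}(${args.join(", ")})`;
}

console.log(
  toPositionalCall("StatCard", ["label", "value", "trend"], {
    label: "Revenue",
    value: "$12.4k",
  }),
); // StatCard("Revenue", "$12.4k")
```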
### `createLibrary(input)` — assemble a component library
Accepts an array of defined components, optional component groups (for prompt organization), and a root component name. Returns a `Library` with `prompt()`, `toJSONSchema()`, and `toSpec()` methods.
```ts
import { createLibrary } from "@openuidev/react-lang";
import type { PromptOptions } from "@openuidev/react-lang";
export const library = createLibrary({
  root: "Dashboard", // LLM always starts with: root = Dashboard(...)
  components: [StatCard, Dashboard],
  componentGroups: [
    {
      name: "Metrics",
      components: ["StatCard"],
      notes: [
        "- Always show a trend indicator when data is available.",
        "- Use 'large-heavy' text style for the value.",
      ],
    },
    {
      name: "Layout",
      components: ["Dashboard"],
    },
  ],
});
// Inline prompt generation (imports React — use generatePrompt for pure backends)
export const promptOptions: PromptOptions = {
  preamble: "You are a dashboard builder. Always output structured UI.",
  additionalRules: ["Prefer compact number formats (1.2k instead of 1200)."],
  examples: [
    `root = Dashboard("Revenue", [revenue, users])\nrevenue = StatCard("Revenue", "$12.4k", "up")\nusers = StatCard("Users", "1,200", "up")`,
  ],
  toolCalls: true, // enables Query(), Mutation(), @Run
  bindings: true, // enables $variables, @Set, @Reset
};
const systemPrompt = library.prompt(promptOptions);
```
### `<Renderer />` — parse and render OpenUI Lang streams
Parses OpenUI Lang text and renders nodes using the provided library. Re-parses incrementally as streaming chunks arrive; forward references resolve when their statements arrive.
```tsx
import { Renderer } from "@openuidev/react-lang";
import { library } from "@/lib/library";
function AssistantMessage({
  content,
  isStreaming,
}: {
  content: string | null;
  isStreaming: boolean;
}) {
  return (
    <Renderer
      content={content}
      isStreaming={isStreaming}
      library={library}
      // Tool implementations backing Query()/Mutation() statements
      toolProvider={{
        list_tickets: async () =>
          fetch("/api/tickets").then((r) => r.json()),
        create_ticket: async (args) =>
          fetch("/api/tickets", {
            method: "POST",
            body: JSON.stringify(args),
          }).then((r) => r.json()),
      }}
      // Receive interactive actions (form submits, button clicks)
      onAction={(event) => {
        if (event.type === "continue_conversation") {
          sendToChat(event.humanFriendlyMessage, event.formState);
        }
      }}
      // Structured errors suitable for LLM self-correction loops
      onError={(errors) => {
        if (!errors.length) return;
        const msg = errors
          .map(
            (e) =>
              `[${e.source}] ${e.statementId ? `"${e.statementId}": ` : ""}${e.message}${e.hint ? `\nHint: ${e.hint}` : ""}`,
          )
          .join("\n\n");
        sendToChat(`Fix these errors:\n\n${msg}`);
      }}
      // Persist and restore form state across messages
      initialState={savedFormState}
      onStateUpdate={(state) => saveFormState(state)}
    />
  );
}
```
### `generatePrompt` / `@openuidev/lang-core` — server-side prompt generation
Framework-agnostic prompt builder. Use this in Node/Edge/serverless routes that must not import React components.
```ts
import { generatePrompt } from "@openuidev/lang-core";
import componentSpec from "./generated/component-spec.json"; // from: openui generate --json-schema
import OpenAI from "openai";
const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
const systemPrompt = generatePrompt({
  ...componentSpec,
  tools: [
    { name: "list_tickets", description: "Return open support tickets." },
    { name: "create_ticket", description: "Create a new ticket." },
  ],
  toolExamples: [
    `tickets = Query("list_tickets", {}, {rows: []})\nroot = Stack([tbl])\ntbl = Table([Col("Title", tickets.rows.title)])`,
  ],
  toolCalls: true,
  bindings: true,
  editMode: true, // LLM patches only changed statements
  inlineMode: true, // LLM can mix text + code in the same response
  preamble: "You are a helpful support dashboard assistant.",
  additionalRules: ['Use @Reset after form submit, not @Set($var, "")'],
});
export async function POST(req: Request) {
  const { messages } = await req.json();
  const completion = await client.chat.completions.create({
    model: "gpt-4o",
    stream: true,
    messages: [{ role: "system", content: systemPrompt }, ...messages],
  });
  return new Response(completion.toReadableStream(), {
    headers: { "Content-Type": "text/event-stream" },
  });
}
```
### Parser APIs — `createParser` and `createStreamingParser`
Low-level parsing utilities. Both accept a JSON schema from `library.toJSONSchema()`.
```ts
import { createParser, createStreamingParser } from "@openuidev/react-lang";
// One-shot parse
const parser = createParser(library.toJSONSchema());
const result = parser.parse(`
root = Stack([header, card])
header = CardHeader("Dashboard")
card = StatCard("Revenue", "$12k")
`);
console.log(result.root); // ElementNode tree
console.log(result.meta.errors); // ValidationError[]
console.log(result.queryStatements); // Query() calls extracted from parse
// Streaming parse — push chunks as they arrive
const streamParser = createStreamingParser(library.toJSONSchema());
for await (const chunk of stream) {
  const partial = streamParser.push(chunk);
  renderPartialResult(partial);
}
const final = streamParser.getResult();
```
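The forward-reference behavior described above can be sketched with a toy statement store: unresolved names render as placeholders until their statements arrive, at which point re-resolving the tree fills them in. The types and placeholder format here are illustrative, not the parser's internals:

```typescript
// Toy model of streaming forward-reference resolution.
type Statement = { refs: string[]; render: (children: string[]) => string };

class StatementStore {
  private statements = new Map<string, Statement>();

  push(name: string, stmt: Statement) {
    this.statements.set(name, stmt);
  }

  // Resolve a node; names not yet seen render as pending placeholders.
  resolve(name: string): string {
    const stmt = this.statements.get(name);
    if (!stmt) return `<pending:${name}>`;
    return stmt.render(stmt.refs.map((r) => this.resolve(r)));
  }
}

const store = new StatementStore();
store.push("root", { refs: ["card"], render: (c) => `Stack[${c.join(",")}]` });
console.log(store.resolve("root")); // Stack[<pending:card>]
store.push("card", { refs: [], render: () => "StatCard" });
console.log(store.resolve("root")); // Stack[StatCard]
```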
### Context hooks for component renderers
These hooks are available inside components defined with `defineComponent`. They must only be called within a component renderer function, not outside of it.
```tsx
import {
useStateField,
useTriggerAction,
useIsStreaming,
useRenderNode,
useFormValidation,
} from "@openuidev/react-lang";
// Example: a custom Select component that binds to a $variable
const FilterSelect = defineComponent({
  name: "FilterSelect",
  description: "Dropdown bound to a reactive $variable.",
  props: z.object({
    name: z.string(),
    value: z.any().optional(), // receives the $variable reference at runtime
    options: z.array(z.string()),
  }),
  component: ({ props }) => {
    // useStateField binds to $variable — changes propagate to all expressions referencing it
    const { value, setValue } = useStateField(props.name, props.value);
    const isStreaming = useIsStreaming();
    return (
      <select
        value={value}
        onChange={(e) => setValue(e.target.value)}
        disabled={isStreaming}
      >
        {props.options.map((opt) => (
          <option key={opt} value={opt}>
            {opt}
          </option>
        ))}
      </select>
    );
  },
});
```
### `tagSchemaId` — name helper schemas in prompt output
Annotates a standalone Zod schema with a readable type name so the generated system prompt shows meaningful signatures instead of `any`.
```ts
import { defineComponent, tagSchemaId } from "@openuidev/react-lang";
import { z } from "zod/v4";
// Without tagSchemaId: action?: any
// With tagSchemaId: action?: ActionExpression
const ActionExpression = z.any();
tagSchemaId(ActionExpression, "ActionExpression");
const Button = defineComponent({
  name: "Button",
  description: "Triggers an action on click.",
  props: z.object({
    label: z.string(),
    action: ActionExpression.optional(),
    variant: z.enum(["primary", "secondary", "danger"]).optional(),
  }),
  component: ({ props }) => <button>{props.label}</button>,
});
```
---
## `@openuidev/react-headless` — Chat state and streaming
### `ChatProvider` — root provider for headless chat
Wraps the component tree with shared chat state. Configure either `apiUrl` (simple) or `processMessage` (full control). Optionally configure thread history via `threadApiUrl` or custom thread functions.
```tsx
import {
ChatProvider,
openAIMessageFormat,
openAIReadableStreamAdapter,
} from "@openuidev/react-headless";
export function App() {
  return (
    <ChatProvider
      processMessage={async ({ messages, threadId, abortController }) => {
        return fetch("/api/chat", {
          method: "POST",
          headers: {
            "Content-Type": "application/json",
            Authorization: `Bearer ${getToken()}`,
          },
          body: JSON.stringify({
            threadId,
            messages: openAIMessageFormat.toApi(messages),
          }),
          signal: abortController.signal,
        });
      }}
      threadApiUrl="/api/threads"
      streamProtocol={openAIReadableStreamAdapter()}
      messageFormat={openAIMessageFormat}
    >
      {/* Your chat UI goes here, e.g. components built on useThread() */}
    </ChatProvider>
  );
}
```
### `useThread()` — active conversation state and actions
Provides messages, send/cancel actions, and message mutation helpers. Use selectors to avoid unnecessary re-renders.
```tsx
import { useState } from "react";
import { useThread } from "@openuidev/react-headless";
function Composer() {
  const { processMessage, cancelMessage, isRunning } = useThread();
  const [input, setInput] = useState("");
  return (
    <form
      onSubmit={(e) => {
        e.preventDefault();
        processMessage(input); // send the current input as a user message
        setInput("");
      }}
    >
      <input value={input} onChange={(e) => setInput(e.target.value)} />
      <button
        type={isRunning ? "button" : "submit"}
        onClick={isRunning ? cancelMessage : undefined}
      >
        {isRunning ? "Stop" : "Send"}
      </button>
    </form>
  );
}
// Selector form — re-renders only when messages change
const messages = useThread((state) => state.messages);
```
### `useThreadList()` — thread sidebar state
Manages thread loading, pagination, selection, creation, and deletion.
```tsx
import { useThreadList } from "@openuidev/react-headless";
function ThreadSidebar() {
  const {
    threads,
    selectedThreadId,
    hasMoreThreads,
    isLoadingThreads,
    loadMoreThreads,
    switchToNewThread,
    selectThread,
    deleteThread,
    updateThread,
  } = useThreadList();
  return (
    <nav>
      <button onClick={() => switchToNewThread()}>New chat</button>
      {isLoadingThreads && <p>Loading...</p>}
      {threads.map((thread) => (
        <div key={thread.id}>
          <button
            onClick={() => selectThread(thread.id)}
            aria-pressed={thread.id === selectedThreadId}
          >
            {thread.title}
          </button>
          <button onClick={() => deleteThread(thread.id)}>✕</button>
        </div>
      ))}
      {hasMoreThreads && (
        <button onClick={() => loadMoreThreads()} disabled={isLoadingThreads}>
          Load more
        </button>
      )}
    </nav>
  );
}
```
### Stream protocol adapters
Select the adapter that matches what your backend actually emits. Mix and match with message format converters independently.
```tsx
import {
openAIAdapter,
openAIReadableStreamAdapter,
openAIResponsesAdapter,
agUIAdapter,
openAIMessageFormat,
openAIConversationMessageFormat,
} from "@openuidev/react-headless";
import { FullScreen } from "@openuidev/react-ui";
// Pass the matching adapter (and message format) to any layout or to ChatProvider.
// Note: the adapter/format pairings below are typical, not mandatory.

// OpenAI SDK response.toReadableStream() (most common)
<FullScreen streamProtocol={openAIReadableStreamAdapter()} messageFormat={openAIMessageFormat} />;
// Raw Chat Completions SSE (direct forwarding of data: chunks)
<FullScreen streamProtocol={openAIAdapter()} messageFormat={openAIMessageFormat} />;
// OpenAI Responses API
<FullScreen streamProtocol={openAIResponsesAdapter()} messageFormat={openAIConversationMessageFormat} />;
// AG-UI protocol (no adapter or converter needed — it's the default)
<FullScreen />;
```
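For intuition, an adapter for the raw Chat Completions SSE case essentially splits `data:` events and concatenates the delta content into the growing OpenUI Lang text. A self-contained sketch of that logic (not the shipped adapter's code):

```typescript
// Extract assistant text from raw OpenAI Chat Completions SSE chunks.
function extractDeltas(sse: string): string {
  let text = "";
  for (const line of sse.split("\n")) {
    // skip non-data lines and the terminal sentinel
    if (!line.startsWith("data: ") || line === "data: [DONE]") continue;
    const chunk = JSON.parse(line.slice(6));
    text += chunk.choices?.[0]?.delta?.content ?? "";
  }
  return text;
}

const sse = [
  'data: {"choices":[{"delta":{"content":"root = "}}]}',
  'data: {"choices":[{"delta":{"content":"Card([a])"}}]}',
  "data: [DONE]",
].join("\n");
console.log(extractDeltas(sse)); // root = Card([a])
```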
### Artifact hooks — `useArtifact` and `useActiveArtifact`
Headless hooks for managing artifact panel state in custom layouts.
```tsx
import { useArtifact, useActiveArtifact } from "@openuidev/react-headless";
import { useId } from "react";
// Inside a defineComponent renderer — bind to a specific artifact
function MyArtifactPreview({ props }: { props: { code: string; title: string } }) {
  const artifactId = useId();
  const { isActive, open, close } = useArtifact(artifactId);
  return (
    <>
      <button onClick={isActive ? close : open}>
        {isActive ? "Viewing" : "Open"} {props.title}
      </button>
      {/* ArtifactPanel portals to ArtifactPortalTarget in the layout */}
    </>
  );
}
// In a layout component — know when any artifact is open
function Layout({ children }: { children: React.ReactNode }) {
  const { isArtifactActive, closeArtifact } = useActiveArtifact();
  return (
    <div>
      {children}
      {isArtifactActive && <button onClick={closeArtifact}>Close panel</button>}
    </div>
  );
}
```
---
## `@openuidev/react-ui` — Prebuilt layouts and component libraries
### `FullScreen` — full-page chat layout
Full-page chat with a thread sidebar. All three layouts share `ChatLayoutProps` and wrap `ChatProvider` internally.
```tsx
import { openAIMessageFormat, openAIReadableStreamAdapter } from "@openuidev/react-headless";
import { FullScreen, createTheme } from "@openuidev/react-ui";
import { openuiLibrary } from "@openuidev/react-ui/genui-lib";
export default function Page() {
  return (
    <FullScreen
      processMessage={async ({ messages, abortController }) => {
        return fetch("/api/chat", {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({
            messages: openAIMessageFormat.toApi(messages),
          }),
          signal: abortController.signal,
        });
      }}
      streamProtocol={openAIReadableStreamAdapter()}
      componentLibrary={openuiLibrary}
      agentName="Assistant"
      threadApiUrl="/api/threads"
      messageFormat={openAIMessageFormat}
      welcomeMessage={{
        title: "Hello!",
        description: "How can I help you today?",
      }}
      conversationStarters={{
        variant: "short",
        options: [
          { label: "Show sales dashboard" },
          { label: "Create a support ticket" },
        ],
      }}
      theme={{
        mode: "dark",
        darkTheme: createTheme({
          interactiveAccentDefault: "oklch(0.72 0.18 260)",
        }),
      }}
    />
  );
}
```
### `Copilot` — sidebar overlay layout
```tsx
import { Copilot } from "@openuidev/react-ui";
import { openuiChatLibrary } from "@openuidev/react-ui/genui-lib";
// Minimal usage sketch (apiUrl form); Copilot shares ChatLayoutProps with FullScreen
export default function Page() {
  return (
    <Copilot
      apiUrl="/api/chat"
      componentLibrary={openuiChatLibrary}
      agentName="Assistant"
    />
  );
}
```
### `BottomTray` — floating collapsible chat widget
```tsx
import { useState } from "react";
import { BottomTray } from "@openuidev/react-ui";
function App() {
  const [isOpen, setIsOpen] = useState(false);
  return (
    <>
      <button onClick={() => setIsOpen(true)}>Open Chat</button>
      {/* open/close prop names below are illustrative */}
      <BottomTray
        apiUrl="/api/chat"
        isOpen={isOpen}
        onClose={() => setIsOpen(false)}
      />
    </>
  );
}
```
### Built-in component libraries (`genui-lib`)
Two ready-to-use libraries ship with `@openuidev/react-ui`, selectable from the `genui-lib` subpath.
```tsx
import {
// Chat-optimized: root=Card, includes FollowUpBlock, ListBlock, SectionBlock
openuiChatLibrary,
openuiChatPromptOptions,
// General-purpose: root=Stack, full layout/chart/form suite
openuiLibrary,
openuiPromptOptions,
} from "@openuidev/react-ui/genui-lib";
// Chat interface (card-based responses, follow-ups, sections)
// Standalone renderer or dashboards (Stack root, Tabs, Accordion, Charts, etc.)
```
### `Artifact()` — inline preview + side panel factory
Wraps a component with a preview/panel split. The preview renders inline in the chat message; clicking it activates the full-size panel.
```tsx
import { defineComponent } from "@openuidev/react-lang";
import { Artifact, ArtifactPortalTarget } from "@openuidev/react-ui";
import { z } from "zod";
import { Prism as SyntaxHighlighter } from "react-syntax-highlighter";
const CodeArtifact = defineComponent({
  name: "CodeArtifact",
  description: "Code block that expands into a full artifact panel.",
  props: z.object({
    language: z.string(),
    title: z.string(),
    code: z.string(),
  }),
  component: Artifact({
    title: (props) => props.title,
    preview: (props, { open, isActive }) => (
      <button onClick={open} aria-pressed={isActive}>
        📄 {props.title} ({props.language})
      </button>
    ),
    panel: (props, { close }) => (
      <div>
        <button onClick={close}>✕</button>
        <SyntaxHighlighter language={props.language}>
          {props.code}
        </SyntaxHighlighter>
      </div>
    ),
  }),
});
// For custom layouts, mount one ArtifactPortalTarget where panels should appear
function CustomLayout() {
  return (
    <div>
      <main>{/* chat messages render here */}</main>
      <ArtifactPortalTarget />
    </div>
  );
}
```
### Thread history — `threadApiUrl` contract
Pass `threadApiUrl` to enable saved thread loading. OpenUI appends its own segments to the base URL.
```tsx
// Minimal setup — implements the default URL contract
<FullScreen apiUrl="/api/chat" threadApiUrl="/api/threads" />;

// Custom thread functions — for non-standard APIs (GraphQL, auth headers, etc.)
<FullScreen
  apiUrl="/api/chat"
  loadThreads={async (cursor) => {
    const res = await fetch(`/api/conversations?cursor=${cursor ?? ""}`);
    return res.json(); // { threads: Thread[], nextCursor?: any }
  }}
  createThread={async (firstMessage) => {
    const res = await fetch("/api/conversations", {
      method: "POST",
      headers: { "Content-Type": "application/json", Authorization: `Bearer ${getToken()}` },
      body: JSON.stringify({ firstMessage }),
    });
    return res.json(); // Thread
  }}
  updateThread={async (thread) => {
    const res = await fetch(`/api/conversations/${thread.id}`, {
      method: "PUT",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(thread),
    });
    return res.json();
  }}
  deleteThread={async (id) => {
    await fetch(`/api/conversations/${id}`, { method: "DELETE" });
  }}
  loadThread={async (threadId) => {
    const res = await fetch(`/api/conversations/${threadId}/messages`);
    return res.json();
  }}
  agentName="Assistant"
/>;
```
---
## OpenUI Lang — Language Reference
### Syntax overview
OpenUI Lang is a line-oriented assignment language: one statement per line. The LLM generates it; the runtime executes it.
```text
# Component statement
identifier = ComponentName(positionalArg1, positionalArg2)
# Reactive state declaration
$variableName = defaultValue
# Data statement (read)
data = Query("tool_name", {arg: $variable}, {defaultValue}, refreshIntervalSeconds?)
# Data statement (write — only runs when triggered by @Run)
result = Mutation("tool_name", {arg: $variable})
# Root entry point — always required
root = Stack([header, content])
```
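The line-oriented shape above can be captured with a toy statement splitter: one `identifier = expression` per non-comment line. This is only a grammar skeleton for intuition; the real parser also handles nested expressions and streaming:

```typescript
// Split OpenUI Lang source into { id, expr } statements.
const STATEMENT = /^(\$?[A-Za-z_][A-Za-z0-9_]*)\s*=\s*(.+)$/;

function splitStatements(src: string): Array<{ id: string; expr: string }> {
  return src
    .split("\n")
    .map((l) => l.trim())
    .filter((l) => l && !l.startsWith("#")) // skip blanks and comments
    .map((l) => {
      const m = STATEMENT.exec(l);
      if (!m) throw new Error(`not a statement: ${l}`);
      return { id: m[1], expr: m[2] };
    });
}

const parsed = splitStatements(`
# a comment
$days = "7"
root = Stack([header])
`);
console.log(parsed.map((s) => s.id)); // [ '$days', 'root' ]
```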
### Reactive state with `$variables`
```text
# Declare variables with defaults
$days = "7"
$search = ""
$showModal = false
# Two-way binding to input components
filter = Select("days", $days, [SelectItem("7", "7 days"), SelectItem("30", "30 days")])
# Variables in expressions — auto-re-evaluates when $days changes
title = TextContent("Last " + $days + " days")
data = Query("analytics", {days: $days}, {rows: []})
# Conditional rendering
$showModal ? Modal("Edit", $showModal, [editForm]) : null
# Change state from buttons
showBtn = Button("Edit", Action([@Set($showModal, true)]))
resetBtn = Button("Clear", Action([@Reset($search, $days)]))
```
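The reactivity model above can be sketched as a small store: each bound expression declares its `$variable` dependencies, and `@Set`/`@Reset` re-run only the expressions that reference the changed variable. This is an illustrative model, not the runtime's code:

```typescript
// Toy reactive store: $variables with defaults, plus dependent expressions.
type Expr = { deps: string[]; compute: (vars: Map<string, unknown>) => unknown };

class ReactiveStore {
  private vars = new Map<string, unknown>();
  private defaults = new Map<string, unknown>();
  private exprs = new Map<string, Expr>();
  results = new Map<string, unknown>();

  declare(name: string, value: unknown) {
    this.vars.set(name, value);
    this.defaults.set(name, value); // remembered for @Reset
  }

  bind(id: string, expr: Expr) {
    this.exprs.set(id, expr);
    this.results.set(id, expr.compute(this.vars));
  }

  // @Set: change a variable and re-evaluate expressions that depend on it
  set(name: string, value: unknown) {
    this.vars.set(name, value);
    for (const [id, e] of this.exprs)
      if (e.deps.includes(name)) this.results.set(id, e.compute(this.vars));
  }

  // @Reset: restore defaults (triggers the same re-evaluation)
  reset(...names: string[]) {
    for (const n of names) this.set(n, this.defaults.get(n));
  }
}

const store = new ReactiveStore();
store.declare("$days", "7");
store.bind("title", {
  deps: ["$days"],
  compute: (v) => `Last ${v.get("$days")} days`,
});
store.set("$days", "30");
console.log(store.results.get("title")); // Last 30 days
```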
### Query and Mutation — data fetching
```text
# Query: executes on load, re-fetches when $variables in args change
tickets = Query("list_tickets", {}, {rows: []})
filtered = Query("get_tickets", {status: $status, days: $days}, {rows: []}, 30)
# Access results with dot notation and array pluck
tbl = Table([
Col("Title", tickets.rows.title),
Col("Status", tickets.rows.status),
Col("Count", "" + @Count(@Filter(tickets.rows, "status", "==", "open")))
])
# Mutation: only runs when triggered by @Run
createResult = Mutation("create_ticket", {title: $title, priority: $priority})
submitBtn = Button("Create", Action([
@Run(createResult),
@Run(tickets),
@Reset($title, $priority)
]))
# Mutation error/success feedback
createResult.status == "error" ? Callout("error", "Failed", createResult.error) : null
createResult.status == "success" ? Callout("success", "Created", "Ticket added.") : null
```
### Built-in functions
```text
# Aggregation
@Count(array) → number
@Sum(array) → number
@Avg(array) → number
@Min(array) / @Max(array) → value
# Filtering and sorting
openItems = @Filter(tickets.rows, "status", "==", "open")
sorted = @Sort(tickets.rows, "created", "desc")
# Composition (KPI card pattern)
kpi = Card([
TextContent("Open Tickets", "small"),
TextContent("" + @Count(@Filter(data.rows, "status", "==", "open")), "large-heavy")
])
# Iteration — render a template for each array element
tags = @Each(tickets.rows, "t", Tag(t.priority, null, "sm"))
# Math
@Round(@Avg(data.rows.score), 1) → 4.2
@Abs(-42) → 42
@Floor(3.9) → 3
@Ceil(3.1) → 4
# Action steps inside Action([...])
Button("Submit", Action([
@Run(mutation), # execute mutation or re-fetch query
@Set($var, "value"), # change a $variable
@Reset($a, $b), # restore $variables to defaults
@ToAssistant("msg"), # send message to LLM
@OpenUrl("url"), # open URL in new tab
]))
```
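Plain-TypeScript equivalents make the aggregation and filtering semantics concrete. These are illustrative helpers mirroring the behavior described above, not the runtime's implementations:

```typescript
// TypeScript analogues of @Count, @Sum, @Avg, @Filter, @Sort.
type Row = Record<string, unknown>;

const Count = (a: unknown[]) => a.length;
const Sum = (a: number[]) => a.reduce((s, x) => s + x, 0);
const Avg = (a: number[]) => (a.length ? Sum(a) / a.length : 0);
const Filter = (rows: Row[], key: string, op: "==" | "!=", val: unknown) =>
  rows.filter((r) => (op === "==" ? r[key] === val : r[key] !== val));
const Sort = (rows: Row[], key: string, dir: "asc" | "desc") =>
  [...rows].sort((a, b) =>
    (a[key] as number) < (b[key] as number)
      ? dir === "asc" ? -1 : 1
      : dir === "asc" ? 1 : -1,
  );

const rows = [
  { status: "open", score: 4 },
  { status: "closed", score: 5 },
  { status: "open", score: 3 },
];
console.log(Count(Filter(rows, "status", "==", "open"))); // 2
console.log(Avg(rows.map((r) => r.score as number))); // 4
```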
### Complete dashboard pattern
```text
$days = "7"
$search = ""
data = Query("get_usage_metrics", {days: $days}, {totalEvents: 0, data: []})
endpoints = Query("get_top_endpoints", {days: $days}, {endpoints: []})
filter = FormControl("Date Range", Select("days", [
SelectItem("7", "7 days"),
SelectItem("30", "30 days")
], null, null, $days))
kpiRow = Stack([
Card([TextContent("Events", "small"), TextContent("" + data.totalEvents, "large-heavy")]),
Card([TextContent("Avg/Day", "small"), TextContent("" + @Round(@Avg(data.data.events), 0), "large-heavy")])
], "row", "m", "stretch", "start", true)
overviewTab = TabItem("overview", "Overview", [LineChart(data.data.day, [Series("Events", data.data.events)])])
endpointsTab = TabItem("endpoints", "Endpoints", [
Table([
Col("Path", endpoints.endpoints.path),
Col("Requests", endpoints.endpoints.requests, "number"),
Col("Latency", endpoints.endpoints.avgLatency, "number")
])
])
root = Stack([CardHeader("Analytics Dashboard"), filter, kpiRow, Tabs([overviewTab, endpointsTab])])
```
### Incremental editing (`editMode`)
When `editMode: true` is set in the prompt config, the LLM emits only changed or new statements. The parser merges by statement name: same name replaces, new name is added, missing names are kept from the previous state.
```text
# Initial LLM output (full generation)
root = Stack([header, tbl])
header = CardHeader("Tickets")
tickets = Query("list_tickets", {}, {rows: []})
tbl = Table([Col("Title", tickets.rows.title)])
# User says: "add a pie chart of ticket status"
# LLM patch (only 2 statements — ~85% fewer tokens):
root = Stack([header, chart, tbl])
chart = PieChart(
["Open", "Closed"],
[@Count(@Filter(tickets.rows, "status", "==", "open")),
@Count(@Filter(tickets.rows, "status", "==", "closed"))],
"donut"
)
# header, tickets, tbl — kept from original unchanged
```
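The merge rule is essentially a keyed map union in which patch entries win on name collision. A minimal sketch, assuming statements are keyed by identifier:

```typescript
// Edit-mode merge: same name replaces, new name is added, absent names are kept.
function mergeStatements(
  previous: Map<string, string>,
  patch: Map<string, string>,
): Map<string, string> {
  return new Map([...previous, ...patch]); // later (patch) entries overwrite
}

const prev = new Map([
  ["root", "Stack([header, tbl])"],
  ["header", 'CardHeader("Tickets")'],
  ["tbl", "Table([...])"],
]);
const patch = new Map([
  ["root", "Stack([header, chart, tbl])"],
  ["chart", "PieChart(...)"],
]);
const merged = mergeStatements(prev, patch);
console.log(merged.size); // 4
console.log(merged.get("root")); // Stack([header, chart, tbl])
```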
---
## Summary
OpenUI is designed for two primary integration patterns: **chat interfaces** with rich generative UI responses, and **standalone interactive dashboards** rendered entirely from LLM-generated OpenUI Lang. For chat interfaces, the typical stack is `FullScreen`/`Copilot`/`BottomTray` from `@openuidev/react-ui` backed by `openAIReadableStreamAdapter()` and `openAIMessageFormat` from `@openuidev/react-headless`, with `openuiChatLibrary` for card-based chat responses or a custom `createLibrary()` definition for domain-specific components. For dashboards and embedded widgets, the `Renderer` component from `@openuidev/react-lang` is used directly, with a `toolProvider` map or MCP client supplying live data through `Query()` and `Mutation()` statements in the generated OpenUI Lang.
Across both patterns, the workflow is: define components with `defineComponent` + Zod schemas, assemble a `createLibrary()` result, run `openui generate` at build time to produce a system prompt (or use `generatePrompt` from `@openuidev/lang-core` for dynamic runtime prompts), include that prompt in every LLM request, and pass `componentLibrary` (or `library`) to the rendering layer. The headless layer (`@openuidev/react-headless`) is independent of the rendering layer and can be used alone to build fully custom chat UIs with `ChatProvider`, `useThread()`, and `useThreadList()` while still benefiting from built-in stream adapters, message format converters, and thread history persistence.