# OSS Quick Start
Run the ggui protocol locally with the ggui CLI. This page walks you from zero to a live local MCP server and viewer in one install.
```
Your Agent ── MCP ──→ ggui serve (local) ──→ /s/<shortCode> viewer (local)
```

## Prerequisites

- Node.js 18+
- pnpm / npm / bun — any recent package manager.
- No account, no API key. Everything is local.
## Step 1: Scaffold a project

```sh
npm create ggui-server@latest my-app
cd my-app
pnpm install   # or: npm install / bun install
```

This drops a tiny, truthful scaffold — one agent file, one `ggui.json`, one `package.json`, plus tsconfig + `.gitignore` + README. See create-ggui-server for the full file list.
## Step 2: Boot the server

```sh
pnpm exec ggui serve
```

`ggui serve` stands up the OSS server (`@ggui-ai/mcp-server`) on `http://127.0.0.1:6781` with the first-run bundle:

- MCP endpoint: `http://127.0.0.1:6781/mcp`
- Session viewer: `http://127.0.0.1:6781/s/<shortCode>`
- Pairing + channel-3 WebSocket: `http://127.0.0.1:6781/ws`
- Operator console: open `http://127.0.0.1:6781/` in a browser
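As an illustration of how those endpoints fit together, a sketch of turning the shortCode a push mints into the viewer link (the `viewerUrl` helper name and `BASE` constant are ours, not part of the published API — only the base address and `/s/<shortCode>` path come from the list above):

```typescript
// Illustrative sketch only: derive the local viewer URL for a session
// shortCode, matching the default ggui serve bind address listed above.
const BASE = "http://127.0.0.1:6781";

function viewerUrl(shortCode: string): string {
  // shortCode is whatever ggui_push minted; encode it defensively.
  return `${BASE}/s/${encodeURIComponent(shortCode)}`;
}

console.log(viewerUrl("abc123")); // → http://127.0.0.1:6781/s/abc123
```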
It also supervises the agent declared at `ggui.json#agent.entry` (default `./agent.ts`) as a sibling process.
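For orientation, a minimal sketch of the `agent.entry` field just mentioned — this shows only that one key, and any other keys the scaffolded `ggui.json` carries are not reproduced here:

```json
{
  "agent": {
    "entry": "./agent.ts"
  }
}
```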
## Step 3: Point an agent runtime at the local MCP

Any MCP-compatible agent runtime works. Drop this into `.mcp.json` at your project root:

```json
{
  "mcpServers": {
    "ggui": {
      "url": "http://127.0.0.1:6781/mcp",
      "headers": { "Authorization": "Bearer dev" }
    }
  }
}
```

Claude Desktop, Claude Code, Cursor, and any runtime that reads `.mcp.json` will discover the `ggui_push` / `ggui_consume` / `ggui_close` tools. From there you can write a system prompt that tells the LLM when to push UIs — the Examples section has working system-prompt recipes per framework.
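Under the hood, tool discovery is a standard MCP `tools/list` call. The JSON-RPC request body a runtime sends to the `/mcp` endpoint looks like this (this is the generic MCP wire format from the protocol spec, not anything ggui-specific):

```json
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }
```

The response enumerates the available tools, which is how the runtime learns about `ggui_push`, `ggui_consume`, and `ggui_close` without any ggui-specific client code.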
## Honest about scope today

- ✅ Local server, viewer, and cookie-authenticated WebSocket subscribe → ack all work end-to-end. Session plumbing is production-shaped, not a mock.
- ✅ `ggui_push` mints shortCodes and lands on the same-origin viewer at `/s/<shortCode>`.
- ⏳ Component-code generation is not yet wired on the OSS path. `ggui_push` returns `codeReady: false`; the viewer shows the live session shell with an empty state (“No stack items yet”) until a stack item arrives. Hosted Guuey owns generation for now — the OSS generator seam is deferred to a later release.
- 🔒 Default auth is dev-mode. Any non-empty bearer token is accepted as the `builder` identity. This is only safe on `127.0.0.1`. Swap in a real `AuthAdapter` via `createGguiServer({ auth })` before exposing the server to anything beyond localhost.
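To make the dev-mode caveat concrete, here is a minimal sketch of replacing "any non-empty token" with a static allowlist. The `AuthAdapter` interface shape below is an assumption for illustration — check the actual interface exported by `@ggui-ai/mcp-server` before wiring it in:

```typescript
// ASSUMPTION: this AuthAdapter shape is illustrative, not the
// published @ggui-ai/mcp-server interface.
interface AuthAdapter {
  // Resolve a bearer token to an identity, or null to reject.
  verify(token: string): Promise<{ identity: string } | null>;
}

// Static allowlist: only known tokens map to the builder identity,
// unlike dev mode, which accepts any non-empty token.
const TOKENS = new Map<string, string>([
  ["replace-with-a-long-random-token", "builder"],
]);

const staticTokenAuth: AuthAdapter = {
  async verify(token) {
    const identity = TOKENS.get(token);
    return identity ? { identity } : null;
  },
};

// Wiring point named in the docs:
// createGguiServer({ auth: staticTokenAuth })
```

Even a static allowlist like this is only a first step; anything internet-facing also wants TLS and token rotation in front of it.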
## What this is not

- Not a hosted service. No `platform.guuey.com` account. No billing. No managed dashboards. For those, see the Hosted Quick Start.
- Not a generation runtime yet. Your agent can push UIs, the viewer will show empty sessions, and the wire + session-state layer works — but the LLM-driven component generator hasn’t been ported to the OSS path. Track the OSS generator seam issue or run hosted Guuey if you need generated components today.
## Repo-local fallback (before the first npm publish)

Until `@ggui-ai/cli` + `create-ggui-server` land on npm, invoke from a workspace checkout:

```sh
pnpm install
pnpm --filter @ggui-ai/devtool build
pnpm --filter @ggui-ai/cli build
node packages/ggui-cli/dist/cli.js serve --mcp-only
```

After publish, the first-run path shortens to `npm create ggui-server@latest` (Step 1 above).
## What’s next

- MCP Protocol Reference — wire format and tool catalogue (same on OSS and hosted; the page shows both endpoint URLs side by side).
- Examples — MCP-config-only integrations with system-prompt recipes per framework (Claude, OpenAI, Gemini, OpenClaw, generic MCP).
- Hosted Quick Start — if you’d rather use hosted Guuey for managed generation + dashboards.
- GitHub repo — source, issue tracker, and the full monorepo walkthrough.