
OSS Quick Start

Run the ggui protocol locally with the ggui CLI. This page walks you from zero to a live local MCP server and viewer in one install.

Your Agent ── MCP ──→ ggui serve (local) ──→ /s/<shortCode> viewer (local)
Prerequisites

  • Node.js 18+
  • pnpm / npm / bun — any recent package manager.
  • No account, no API key. Everything is local.
Step 1: Scaffold a project
npm create ggui-server@latest my-app
cd my-app
pnpm install # or: npm install / bun install

This drops a tiny, truthful scaffold — one agent file, one ggui.json, one package.json, plus tsconfig + .gitignore + README. See create-ggui-server for the full file list.

Step 2: Start the local server
pnpm exec ggui serve

ggui serve stands up the OSS server (@ggui-ai/mcp-server) on http://127.0.0.1:6781 with the first-run bundle:

  • MCP endpoint: http://127.0.0.1:6781/mcp
  • Session viewer: http://127.0.0.1:6781/s/<shortCode>
  • Pairing + channel-3 WebSocket: http://127.0.0.1:6781/ws
  • Operator console: open http://127.0.0.1:6781/ in a browser

It also supervises the agent declared at ggui.json#agent.entry (default ./agent.ts) as a sibling process.
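
Under the hood, the /mcp endpoint speaks JSON-RPC 2.0 per the MCP spec. Here is a sketch of the first request any MCP client POSTs to it; the method and param names come from the public MCP spec, not from ggui, and the clientInfo values are placeholders:

```typescript
// JSON-RPC 2.0 "initialize" envelope an MCP client POSTs to
// http://127.0.0.1:6781/mcp. Field names follow the public MCP spec;
// clientInfo values are placeholders.
const initialize = {
  jsonrpc: "2.0",
  id: 1,
  method: "initialize",
  params: {
    protocolVersion: "2024-11-05",
    capabilities: {},
    clientInfo: { name: "example-client", version: "0.0.0" },
  },
};

// Dev-mode auth: any non-empty bearer token is accepted on localhost.
const headers = {
  "Content-Type": "application/json",
  Authorization: "Bearer dev",
};

console.log(JSON.stringify(initialize));
```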

Step 3: Point an agent runtime at the local MCP

Any MCP-compatible agent runtime works. Drop this into .mcp.json at your project root:

{
  "mcpServers": {
    "ggui": {
      "url": "http://127.0.0.1:6781/mcp",
      "headers": { "Authorization": "Bearer dev" }
    }
  }
}

Claude Desktop, Claude Code, Cursor, and any runtime that reads .mcp.json will discover the ggui_push / ggui_consume / ggui_close tools. From there you can write a system prompt that tells the LLM when to push UIs — the Examples section has working system-prompt recipes per framework.
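
Once discovered, those tools are invoked with the standard MCP tools/call method. A hedged sketch of what a ggui_push call looks like on the wire; the arguments object is purely illustrative, and the real input schema lives in the MCP Protocol Reference:

```typescript
// Standard MCP "tools/call" envelope targeting the ggui_push tool.
// The arguments object is a placeholder; consult the MCP Protocol
// Reference for the actual ggui_push input schema.
const pushCall = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: {
    name: "ggui_push",
    arguments: {}, // tool-specific payload goes here
  },
};

console.log(JSON.stringify(pushCall));
```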

What works (and what doesn’t)

  • Local server, viewer, and cookie-authenticated WebSocket subscribe → ack all work end-to-end. Session plumbing is production-shaped, not a mock.
  • ggui_push mints shortCodes and lands on the same-origin viewer at /s/<shortCode>.
  • Component-code generation is not yet wired on the OSS path. ggui_push returns codeReady: false; the viewer shows the live session shell with an empty state (“No stack items yet”) until a stack item arrives. Hosted Guuey owns generation for now — the OSS generator seam is deferred to a later release.
  • 🔒 Default auth is dev-mode. Any non-empty bearer token is accepted as the builder identity. This is only safe on 127.0.0.1. Swap in a real AuthAdapter via createGguiServer({ auth }) before exposing the server to anything beyond localhost.
  • Not a hosted service. No platform.guuey.com account. No billing. No managed dashboards. For those, see the Hosted Quick Start.
  • Not a generation runtime yet. Your agent can push UIs, the viewer will show empty sessions, and the wire + session-state layer works — but the LLM-driven component generator hasn’t been ported to the OSS path. Track the OSS generator seam issue or run hosted Guuey if you need generated components today.
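
Before exposing the server beyond localhost, swap out the dev-mode check as noted above. The docs name createGguiServer({ auth }) and AuthAdapter but not the interface shape, so the shape below is an assumption; a static-token adapter just to illustrate the swap:

```typescript
// HYPOTHETICAL AuthAdapter shape: only the names createGguiServer and
// AuthAdapter come from the docs; the method signature below is assumed.
interface AuthAdapter {
  verifyBearer(token: string): Promise<{ builderId: string } | null>;
}

// Static-token adapter: rejects everything except one secret from the
// environment, instead of dev-mode's "any non-empty token" behavior.
const staticTokenAuth: AuthAdapter = {
  async verifyBearer(token) {
    const expected = process.env.GGUI_TOKEN;
    if (!expected || token !== expected) return null;
    return { builderId: "local-operator" };
  },
};

// Would then be wired in roughly as: createGguiServer({ auth: staticTokenAuth })
```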

Repo-local fallback (before the first npm publish)

Until @ggui-ai/cli + create-ggui-server land on npm, invoke from a workspace checkout:

pnpm install
pnpm --filter @ggui-ai/devtool build
pnpm --filter @ggui-ai/cli build
node packages/ggui-cli/dist/cli.js serve --mcp-only

After publish, the first-run path shortens to npm create ggui-server@latest (Step 1 above).

Next steps

  • MCP Protocol Reference — wire format and tool catalogue (same on OSS and hosted; the page shows both endpoint URLs side by side).
  • Examples — MCP-config-only integrations with system-prompt recipes per framework (Claude, OpenAI, Gemini, OpenClaw, generic MCP).
  • Hosted Quick Start — if you’d rather use hosted Guuey for managed generation + dashboards.
  • GitHub repo — source, issue tracker, and the full monorepo walkthrough.