Chat with Your Own Storage
You own the UI, you own the storage. ggui gives you the streaming protocol
(`useInvoke`) and a handful of pure helpers for persistence shape
(`@ggui-ai/react/chat-helpers`). Everything else — where messages live, how
threads are indexed, what your composer looks like — is yours.
When to use this pattern
Use the pattern on this page when at least one of these is true:
- You already have a persistence layer (Postgres, Firestore, IndexedDB, your Redux store, anything) and don’t want a second one.
- You need complete control over the message schema — custom attachments, per-message ACLs, server-side fan-out, etc.
- You’re integrating ggui into an existing chat experience, not building a new one.
If none of those apply and you just want “a chat UI that works”, wait for
`useChatThread` in `@ggui-ai/react/chat-thread` — it wraps
this same flow behind a single hook with a pluggable storage adapter.
60-line example
The file below is the complete working example. The entire integration — streaming, persistence, card rendering, send — is ~60 lines.
```tsx
import { useEffect } from "react";
import { GguiProvider, useInvoke } from "@ggui-ai/react";
import {
  useRafThrottled,
  invokeMessageToContentGroups,
  extractStackItemFromToolResult,
  type ContentGroup,
} from "@ggui-ai/react/chat-helpers";

// Replace with any storage: localStorage, fetch, IndexedDB, Firestore, …
const store: Array<{ threadId: string; group: ContentGroup }> = [];

function persist(threadId: string, messages: ReturnType<typeof useInvoke>["messages"]) {
  const seen = new Set(
    store.filter((e) => e.threadId === threadId).map((e) => e.group.key),
  );
  for (const msg of messages) {
    for (const group of invokeMessageToContentGroups(msg)) {
      if (!seen.has(group.key)) store.push({ threadId, group });
    }
  }
}

function Chat({ threadId, endpointUrl }: { threadId: string; endpointUrl: string }) {
  const { messages, send, isStreaming } = useInvoke({ endpointUrl });
  const throttled = useRafThrottled(messages);

  useEffect(() => {
    persist(threadId, messages);
  }, [threadId, messages]);

  return (
    <div>
      {throttled.map((m) => (
        <div key={m.id}>
          <strong>{m.role}: </strong>
          {m.content.map((b, i) => {
            if (b.type === "text") return <p key={i}>{b.text}</p>;
            if (b.type === "tool_result") {
              const item = extractStackItemFromToolResult(b);
              return <pre key={i}>{JSON.stringify(item, null, 2)}</pre>;
            }
            return null;
          })}
        </div>
      ))}
      <button
        disabled={isStreaming}
        onClick={() => send("hello", { clientMessageId: crypto.randomUUID() })}
      >
        Send hello
      </button>
    </div>
  );
}

export default function App() {
  return (
    <GguiProvider appId="demo">
      <Chat threadId="t1" endpointUrl="https://example-agent.dev" />
    </GguiProvider>
  );
}
```

This page’s code snippets are compiled against the workspace
`@ggui-ai/react` on every CI run — if this page builds, the pattern works.
The ContentGroup contract
`invokeMessageToContentGroups(message)` splits a finalized invoke message
into one or more `ContentGroup`s — the durable unit you persist:

```ts
interface ContentGroup {
  key: string;                  // `${message.id}-${startBlockIdx}` — see invariant below
  kind: "text" | "card" | "other";
  authorRole: "user" | "agent";
  blocks: ContentBlock[];       // a contiguous run of text, or a tool_use + tool_result pair
  cardSnapshot: unknown | null; // frozen StackItem for kind="card"
  textPreview: string;          // ~160-char preview for chat-list tiles
}
```

**The key invariant.** `key` is deterministic from `message.id` and the
block index where the group starts. That has two consequences you should
design against:
- **Idempotency.** Re-persisting the same group with the same `key` is a
  no-op. If your storage uses `key` as the primary key (recommended), the
  `persist()` loop above can run after every token delta without creating duplicates.
- **Streaming messages are excluded.** A message whose `isStreaming` is
  `true` returns `[]` — you get groups only when the turn has finalized.
  That’s why `persist()` doesn’t need a separate “on end_turn” callback.
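To make the idempotency concrete, here is a minimal sketch. It uses a locally defined stand-in for `ContentGroup` (only two fields) and a hypothetical `KeyedGroupStore` class, not ggui APIs: a store keyed by `key`, so re-running the persist loop after every delta never duplicates a group.

```typescript
// Local stand-in for the library type; only the fields this sketch needs.
interface ContentGroup {
  key: string;
  textPreview: string;
}

// Hypothetical keyed store: writing the same group twice is a no-op.
class KeyedGroupStore {
  private groups = new Map<string, ContentGroup>();

  // Returns true if the group was newly inserted, false if it already existed.
  upsert(group: ContentGroup): boolean {
    if (this.groups.has(group.key)) return false;
    this.groups.set(group.key, group);
    return true;
  }

  size(): number {
    return this.groups.size;
  }
}

const keyed = new KeyedGroupStore();
keyed.upsert({ key: "msg_1-0", textPreview: "hello" }); // inserts
keyed.upsert({ key: "msg_1-0", textPreview: "hello" }); // ignored, same key
```

The same shape maps directly onto a database: declare `key` as the primary key and use an upsert (`INSERT … ON CONFLICT DO NOTHING` or equivalent) instead of the Map.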
Reloading a thread
On thread reopen, rebuild the `ConversationMessage[]` your store remembers
and seed the hook via `initialMessages`. `contentGroupsToConversationMessages`
collapses groups sharing the same `message.id` prefix back into one message:
```ts
import { contentGroupsToConversationMessages } from "@ggui-ai/react/chat-helpers";

function useSeededInvoke(threadId: string, endpointUrl: string) {
  // Resolve before render — useInvoke captures initialMessages on mount.
  const groups = store
    .filter((e) => e.threadId === threadId)
    .map((e) => e.group);
  const seed = contentGroupsToConversationMessages(groups);
  return useInvoke({ endpointUrl, initialMessages: seed });
}
```

`initialMessages` is a seed on mount. Changing it on a re-render does
not reset the hook’s state — that’s intentional; the hook owns the
conversation after mount. If the user switches threads, unmount the
`<Chat>` subtree (change `key={threadId}`) and let the new instance seed
from the new thread’s store.
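To illustrate the collapse step, here is a sketch of the grouping that `contentGroupsToConversationMessages` performs. This is not the library source: `collapseGroups` and the two local interfaces are stand-ins, and the only assumption carried over from this page is the key format `${message.id}-${startBlockIdx}`.

```typescript
// Local stand-in types, not the library's.
interface ContentGroup {
  key: string; // `${message.id}-${startBlockIdx}`
  authorRole: "user" | "agent";
  blocks: unknown[];
}

interface ConversationMessage {
  id: string;
  role: "user" | "agent";
  content: unknown[];
}

// Illustrative sketch: merge groups that share a message.id prefix back
// into one message, preserving block order within each message.
function collapseGroups(groups: ContentGroup[]): ConversationMessage[] {
  const byMessage = new Map<string, ConversationMessage>();
  for (const g of groups) {
    // The start-block index sits after the last "-"; the prefix is message.id.
    const messageId = g.key.slice(0, g.key.lastIndexOf("-"));
    const existing = byMessage.get(messageId);
    if (existing) {
      existing.content.push(...g.blocks);
    } else {
      byMessage.set(messageId, {
        id: messageId,
        role: g.authorRole,
        content: [...g.blocks],
      });
    }
  }
  // Map preserves insertion order, so messages come out in persisted order.
  return [...byMessage.values()];
}
```

If your store returns groups ordered by `key`, the output is already in conversation order; otherwise sort the groups before collapsing.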
Why send({ clientMessageId }) matters
`useInvoke` accepts an optional `clientMessageId` the caller controls:

```ts
send("hello", { clientMessageId: crypto.randomUUID() });
```

The rendered user message’s `id` becomes that value, which means:
- **Retry without duplicates.** If the network fails and you retry the same
  `send`, your storage sees the same `clientMessageId` → the same
  `ContentGroup.key` → the outbox is idempotent by construction.
- **Cross-device continuity.** If you persist user messages optimistically
  on the sending device and the agent turn later replays on another
  device, the same `clientMessageId` collapses them into one thread entry.
Without `clientMessageId`, `useInvoke` falls back to a random
`user_<hex>` id — fine for ephemeral chats, wrong for anything durable.
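The retry property falls out of reusing the same id across attempts. A hedged sketch, assuming a generic `SendFn` signature shaped like the `send` call above (`sendWithRetry` is a hypothetical helper, not a ggui API):

```typescript
// Hypothetical retry wrapper: every attempt reuses the SAME clientMessageId,
// so the rendered message id, and therefore ContentGroup.key, is stable
// across retries and storage stays idempotent.
type SendFn = (text: string, opts: { clientMessageId: string }) => Promise<void>;

async function sendWithRetry(
  sendFn: SendFn,
  text: string,
  clientMessageId: string,
  maxAttempts = 3,
): Promise<string> {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      await sendFn(text, { clientMessageId });
      return clientMessageId;
    } catch (err) {
      // Transient failure: fall through and retry with the same id.
      if (attempt === maxAttempts) throw err;
    }
  }
  return clientMessageId; // only reachable when maxAttempts < 1
}
```

Mint the id once (e.g. `crypto.randomUUID()`) when the user hits Send, then pass that one value to every attempt; minting a new id per attempt would reintroduce duplicates.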
What you still have to do
The helpers stop at shape. You still own:
- **Transport to storage.** `store.push(...)` above is a JavaScript array
  for brevity. In real integrations, replace it with a `fetch('/persist')`,
  an IndexedDB write, a Firestore batch, …
- **Thread indexing.** `ContentGroup.textPreview` is the building block for
  chat-list tiles; wiring it into a thread list is up to you.
- **Reconnect + resume.** `useInvoke` does not yet replay an interrupted
  stream. If the page reloads mid-turn, the in-flight assistant message is
  lost — only finalized groups persist. That’s a deliberate boundary for
  segment 3; segment 2 (`useChatThread`) will close it.
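One way to keep the transport swappable is to inject the writer behind a tiny function boundary. A sketch under assumed names (`PersistedGroup`, `GroupWriter`, and `persistGroups` are all hypothetical, not ggui APIs), shown with the dedupe-by-`key` step kept on the client side:

```typescript
// Hypothetical record shape: whatever your backend wants per group.
interface PersistedGroup {
  threadId: string;
  key: string;
}

// The injected transport: fetch, IndexedDB, Firestore batch, or an array.
type GroupWriter = (batch: PersistedGroup[]) => Promise<void>;

// Dedupe by ContentGroup.key before hitting the transport at all; returns
// how many new groups were actually written.
async function persistGroups(
  groups: PersistedGroup[],
  alreadyPersisted: Set<string>,
  write: GroupWriter,
): Promise<number> {
  const fresh = groups.filter((g) => !alreadyPersisted.has(g.key));
  if (fresh.length === 0) return 0;
  // e.g. fetch("/persist", { method: "POST", body: JSON.stringify(fresh) })
  await write(fresh);
  for (const g of fresh) alreadyPersisted.add(g.key);
  return fresh.length;
}
```

Because the writer is a parameter, the same loop runs unchanged against an in-memory array in tests and a network endpoint in production.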
When those boundaries start to hurt, `useChatThread` is the next step up.
It wraps the same primitives with a storage adapter interface and a
`ChatThreadProvider` — you bring a `MessageStorageAdapter`, it brings the
outbox, the seed-on-reopen wiring, and the optimistic-send UX.