Launch · May 10

Dock, inside ChatGPT: turn chats into workspaces

We shipped a curated Dock app on OpenAI's Apps SDK plus the full 48-tool MCP server. Save the chat to a doc, spin up a workspace from a prompt, drop rows into a table, all without leaving ChatGPT. Here's how the integration works, what each path is for, and the install in two clicks.

By govind · 7 min read · from trydock.ai

ChatGPT is one-to-one and forgetful. You have a brilliant exchange about a launch plan, a hiring rubric, a monthly P&L, and an hour later it's somewhere in your sidebar history, three threads down, eventually pruned. Dock is the canvas every chatbot writes on. Now ChatGPT can write directly to it.

TL;DR
There are two ways to connect Dock to ChatGPT: a curated 5-tool app on the OpenAI Apps SDK with inline widgets (workspace picker, doc preview, Approve / Cancel card), and a full 48-tool MCP server for power users who want every Dock primitive. Both speak to the same backend; pick whichever fits your usage. The install is two clicks in ChatGPT Settings → Apps. trydock.ai/chatgpt-app is the landing page; /docs/agents/native/chatgpt is the install reference.

Why this matters

The problem with conversational AI as a work surface is that conversations evaporate. You spend forty-five minutes shaping a brief in ChatGPT, then realize you have nowhere to put it. You copy-paste into a doc. You re-prompt to extract a table. You scroll back through the thread three days later looking for that one good paragraph and the search misses it.

This is the wrong abstraction for work. Real work needs a place that persists, that other people (and other agents) can read and write to, that has structure beyond a wall of text.

We've been building that place: a workspace where every agent you run is a first-class identity with its own key, scopes, and audit trail, sitting beside humans on the same docs and tables. As of today, ChatGPT is one of those agents. You talk to ChatGPT, ChatGPT writes to Dock, you keep working.

Two paths, same backend

OpenAI ships two ways to integrate. We shipped both.

Path A: the curated Dock app (5 tools, inline widgets)

The OpenAI Apps SDK lets you bundle a small MCP server with a set of inline UI widgets: actual React components that render inside the ChatGPT chat thread rather than as plain text. The Dock app on the Apps SDK exposes five user-shaped tools:

  • save_chat_to_dock. Take the current ChatGPT conversation, summarize it, append it as a section to a Dock doc of your choice. Widget: a workspace picker with a preview of where the summary will land.
  • create_dock_workspace. Spin up a new workspace from a prompt. Widget: a form with name + initial markdown + default surface kind, prefilled from your prompt.
  • upsert_dock_rows. Drop one or more rows into a Dock table. Widget: a row-confirmation card showing the rows about to be created with their column mapping.
  • summarize_dock_workspace. Pull the contents of a Dock workspace into the chat thread, summarized. Widget: a quick metadata header (last updated, row count, surface kind) above the summary.
  • show_dock_workspaces. Search across your Dock workspaces, render the matches as an inline picker. Useful for the "I want to review monthly numbers, where are my finance workspaces" moment.
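To make the shape of these tools concrete, here is a hedged sketch of what upsert_dock_rows' input contract and the column-mapping check behind its confirmation card might look like. The field names (table_id, rows) and the mapping logic are illustrative assumptions, not Dock's actual schema:

```python
# Hypothetical input schema for the upsert_dock_rows tool. Field names
# (table_id, rows) are illustrative guesses, not Dock's real contract.
UPSERT_DOCK_ROWS_SCHEMA = {
    "type": "object",
    "required": ["table_id", "rows"],
    "properties": {
        "table_id": {"type": "string"},
        "rows": {
            "type": "array",
            "items": {"type": "object"},  # one object per row, keyed by column name
        },
    },
}

def map_rows_to_columns(rows: list[dict], columns: list[str]) -> list[dict]:
    """Mimic the row-confirmation card: report which known columns each
    row fills and which it leaves blank, rejecting unknown keys outright."""
    report = []
    for row in rows:
        unknown = sorted(set(row) - set(columns))
        if unknown:
            raise ValueError(f"unknown columns: {unknown}")
        report.append({
            "mapped": {c: row[c] for c in columns if c in row},
            "missing": [c for c in columns if c not in row],
        })
    return report
```

The point of surfacing the mapping before the write is exactly what the widget does: the user sees filled and blank columns per row before anything lands in the table.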

Every destructive operation routes through an in-chat Approve / Cancel card. You see exactly what's about to land before it lands.

The curated app is the right choice for ~90% of users. You don't need every Dock primitive; you need the half-dozen actions that map to "save what I just talked about" and "build the next thing from the prompt."

Path B: the full Dock MCP server (48 tools)

If you want every Dock primitive (bulk row operations, billing flows, webhook subscriptions, agent management, doc-format validation), point ChatGPT at the same MCP server every other client uses: trydock.ai/api/mcp. Same OAuth handshake, same workspace access rules. You get all 48 tools in your toolbelt.
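Under the hood, any MCP client (ChatGPT included) discovers a server's tools with a standard JSON-RPC 2.0 tools/list request. A minimal sketch of that envelope, independent of anything Dock-specific:

```python
import json

def make_tools_list_request(request_id: int = 1) -> str:
    """Build the JSON-RPC 2.0 envelope an MCP client sends to enumerate
    a server's tools (the MCP `tools/list` method needs no parameters)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
    })

# The server replies with {"jsonrpc": "2.0", "id": ..., "result": {"tools": [...]}};
# against trydock.ai/api/mcp that tools array would carry all 48 entries.
```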

Most people don't need this. The 48-tool surface is for the power user who's wiring ChatGPT into a real ops workflow: managing webhooks, rotating API keys, auditing recent events, running batch row operations across surfaces.

The two paths coexist cleanly. ChatGPT treats each as a separate connector in its app drawer. Many teams install the curated app for daily use and add the full MCP for occasional power workflows.

How the install works

In ChatGPT, both paths go through the same flow.

  1. Settings → Apps → Advanced settings → toggle Developer Mode on. OpenAI gates custom MCP apps behind this developer flag. It's a single toggle, no review queue.

  2. Settings → Apps → Create app. Pick a name (Dock works), paste the MCP server URL:

    • Curated app: https://chatgpt-app.trydock.ai/sse
    • Full MCP: https://trydock.ai/api/mcp

    Pick OAuth as the authentication method. ChatGPT auto-registers via Dock's /api/oauth/register endpoint (Dynamic Client Registration, RFC 7591), opens a browser tab, you sign into Dock, and you grant the agent access. Done.
  3. Use it. Open a chat, type something like "Pull up my finance workspaces, I want to review this month's numbers." ChatGPT calls show_dock_workspaces against the search query "finance" and renders the three matching workspaces as an inline card. Click one, you're in Dock with the right workspace open.
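The auto-registration in step 2 follows RFC 7591: ChatGPT POSTs a client-metadata document to /api/oauth/register and gets a client_id back. A hedged sketch of that kind of payload; the field names are standard RFC 7591 metadata, but the exact document ChatGPT sends isn't published, and the redirect URI below is a placeholder:

```python
import json

def build_dcr_request(client_name: str, redirect_uri: str) -> str:
    """Assemble an RFC 7591 Dynamic Client Registration body. All keys are
    standard RFC 7591 client metadata; the values here are illustrative."""
    return json.dumps({
        "client_name": client_name,
        "redirect_uris": [redirect_uri],
        "grant_types": ["authorization_code", "refresh_token"],
        "response_types": ["code"],
        "token_endpoint_auth_method": "none",  # public client, assuming PKCE
    })
```

The registration endpoint's response includes the issued client_id, which the client then uses for the normal authorization-code flow in the browser tab.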

Two minutes, three clicks. The trydock.ai/chatgpt-app landing page has a visual walk-through if you'd rather see the screens than read the steps.

The auth model: per-user, not per-service

Every ChatGPT user who installs the Dock app signs into Dock with their own identity. The agent acts as them, not as a shared service account. This matters because:

  • Every action is attributed to that user in the workspace's audit trail.
  • The user's existing workspace permissions apply: they can only see workspaces they're a member of, and can only write to workspaces where they have editor access.
  • Removing the user from a workspace immediately cuts off the agent's access to that workspace too. The agent doesn't survive a permission change.
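One way to picture this model: the server checks the user's live membership on every tool call, rather than consulting any separate agent grant. A toy sketch, with illustrative role names:

```python
# Toy membership table: workspace id -> {user id: role}. Role names are
# illustrative, not Dock's actual permission vocabulary.
memberships = {
    "ws_finance": {"ana": "editor", "ben": "viewer"},
}

def agent_may_read(user_id: str, workspace_id: str) -> bool:
    """Any member can read; non-members see nothing."""
    return user_id in memberships.get(workspace_id, {})

def agent_may_write(user_id: str, workspace_id: str) -> bool:
    """The agent inherits exactly the user's access: editor means write,
    viewer means read-only. There is no separate agent grant to revoke."""
    return memberships.get(workspace_id, {}).get(user_id) == "editor"
```

Because the check runs against live membership, removing a user from a workspace revokes the agent in the same instant, with no cleanup step.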

Destructive operations (delete workspace, change member roles, downgrade plan) go through Dock's two-call confirmation pattern. The agent's first call returns a confirm_token plus a human-readable summary; the agent surfaces it in chat; the user confirms; the agent re-calls with the token. On the curated app, this surfaces as the Approve / Cancel card. On the full MCP, the agent quotes the summary inline and waits for the user's "yes."
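A minimal sketch of that two-call pattern, assuming an in-memory token store, single-use tokens, and a 5-minute expiry (all implementation details the post doesn't specify):

```python
import secrets
import time

_pending: dict[str, tuple[str, float]] = {}  # confirm_token -> (action, issued_at)
TOKEN_TTL_SECONDS = 300  # assumed 5-minute window, not a documented Dock value

def request_destructive_action(action: str) -> dict:
    """First call: nothing executes. Return a confirm_token plus the
    human-readable summary the agent must surface in chat."""
    token = secrets.token_urlsafe(16)
    _pending[token] = (action, time.monotonic())
    return {"confirm_token": token, "summary": f"About to run: {action}"}

def confirm_destructive_action(token: str) -> str:
    """Second call: execute only if the token is known, unused, and fresh."""
    entry = _pending.pop(token, None)  # pop makes the token single-use
    if entry is None:
        raise PermissionError("unknown or already-used confirm_token")
    action, issued_at = entry
    if time.monotonic() - issued_at > TOKEN_TTL_SECONDS:
        raise PermissionError("confirm_token expired")
    return f"executed: {action}"
```

The curated app's Approve / Cancel card is just a nicer front end on the same handshake: approving re-invokes the tool with the stored token.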

Why we built it this way

The Apps SDK is new (pre-1.0 as of this writing) and the temptation when shipping on a new platform is to ship a 1:1 mirror of whatever you already have. Dock's MCP has 48 tools. Putting all 48 on the Apps SDK would have been the lazy move, and the wrong one, because ChatGPT users aren't agents writing scripts. They're humans having a conversation, and the right surface for a human is a handful of high-leverage actions with rich inline confirmation, not a kitchen sink of primitives.

So we curated. Five tools, each with a custom widget, each with the confirm card where it matters. The full MCP stays available for the power user who wants the kitchen sink; they install it as a second connector and use it when they need it.

The other reason: ChatGPT's app drawer is a discovery surface. People scroll through it asking "what does this app do," not "give me every primitive your API exposes." Five sharp tools telegraph the use case. Forty-eight don't.

What's next

Two threads we're pulling on:

  • Memory across chats. Right now each ChatGPT thread is its own session and the Dock app starts fresh in each one. We're prototyping a thin "active workspace" memory so a follow-up chat picks up where the last one left off without re-prompting.
  • OpenAI app directory submission. The Dock app is currently a custom-MCP install via Developer Mode. When OpenAI opens the public app directory to general availability, the curated app submission goes in immediately, at which point any ChatGPT user (not just Developer Mode users) can install it in two clicks from the directory.

The full code for the curated app is open: /apps/dock-chatgpt-app/ in the Dock monorepo. It's worth reading if you're building your own Apps SDK integration: the widget plumbing (envelope shape, MCP resources channel, openai/widgetAccessible flag, triple-channel widget data subscription) is finicky, and the docs don't cover the whole story.
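For a flavor of that plumbing, a tool that wants a widget to call back into it carries a flag in its _meta block. The sketch below is illustrative only: the tool name and the openai/widgetAccessible key come from this post, while the surrounding descriptor shape is an assumption about the SDK's conventions, not a verified contract:

```python
# Illustrative Apps SDK tool descriptor. Only the MCP basics (name,
# inputSchema) and the openai/widgetAccessible flag are grounded in the
# post; treat the exact shape as a guess.
SAVE_CHAT_TOOL = {
    "name": "save_chat_to_dock",
    "description": "Summarize the current conversation into a Dock doc section.",
    "inputSchema": {
        "type": "object",
        "properties": {"doc_id": {"type": "string"}},
        "required": ["doc_id"],
    },
    "_meta": {
        # Lets the rendered widget invoke this tool from inside the thread.
        "openai/widgetAccessible": True,
    },
}
```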

If you're on ChatGPT Plus, Team, or Enterprise: trydock.ai/chatgpt-app → install → save the next conversation. If you're on the free tier, start with Dock directly; the workspace is free and you can connect any MCP client you like.

ChatGPT chats. Dock keeps.
