Connect
Ollama
Ollama runs local models. Its native tool-calling can hit MCP servers via the community ollama-mcp-bridge. Point the bridge at Dock and your local model can read and write Dock workspaces with no data leaving your laptop (except to Dock).
- Client: Ollama (local model runner; MCP via Ollama + ollama-mcp-bridge)
- Transport: HTTP JSON-RPC (streamable-http)
- Server: Dock MCP at trydock.ai/api/mcp · 37 tools · OAuth 2.1 + DCR · Bearer
Auth path
1. Mint a dk_ key in Dock Settings → API keys.
2. Paste it as Authorization: Bearer dk_… in the client config.
3. The client calls Dock MCP directly on every request.
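To confirm the key works before wiring it into a client, you can POST a JSON-RPC initialize request straight to the endpoint. This is a minimal sketch assuming the standard MCP streamable-http handshake; Dock's exact header and protocol-version requirements may differ.

```sh
# Sanity-check a dk_ key against the Dock MCP endpoint.
# Assumes the standard MCP streamable-http handshake; details may vary on Dock's side.
curl -s https://trydock.ai/api/mcp \
  -H "Authorization: Bearer dk_live_c914f1c6..." \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
          "protocolVersion": "2025-03-26",
          "capabilities": {},
          "clientInfo": { "name": "curl-check", "version": "0.0.1" }
        }
      }'
```

A 401 typically means the key or header is wrong; a JSON-RPC result means the auth path is good.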
Prerequisites
- Ollama installed with a tool-capable model (e.g. qwen3 or llama3.3); see the pull example below.
- A Dock dk_ key.
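A quick way to cover the first prerequisite, using the example model names above and assuming a standard Ollama install:

```sh
# Pull a tool-capable model and confirm it is available locally.
ollama pull qwen3
ollama list
# Newer Ollama builds print model info, including a capabilities section
# that should list "tools" for tool-capable models.
ollama show qwen3
```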
Bridge
npm i -g ollama-mcp-bridge (community package) or use the built-in MCP client in newer Ollama builds.ollama-mcp-bridge configjson
{
"mcpServers": {
"dock": {
"url": "https://trydock.ai/api/mcp",
"headers": {
"Authorization": "Bearer dk_live_c914f1c6..."
}
}
},
"ollamaModel": "qwen3:latest"
}Troubleshooting
Troubleshooting

| Symptom | Fix |
| --- | --- |
| Local model doesn't use tools | Tool-use support varies by model. Try qwen3, llama3.3, or command-r-plus; older small models often ignore tool specs. A quick local check is shown below. |
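To check whether the model itself emits tool calls, independent of the bridge, you can hit Ollama's local chat API with a dummy tool definition. A sketch assuming the default localhost:11434 port; get_weather is a made-up tool used only for this probe.

```sh
# Probe the local model's tool-calling: a tool-capable model should reply
# with "tool_calls" in the message instead of plain text.
curl -s http://localhost:11434/api/chat -d '{
  "model": "qwen3",
  "stream": false,
  "messages": [{ "role": "user", "content": "What is the weather in Berlin?" }],
  "tools": [{
    "type": "function",
    "function": {
      "name": "get_weather",
      "description": "Get the current weather for a city",
      "parameters": {
        "type": "object",
        "properties": { "city": { "type": "string" } },
        "required": ["city"]
      }
    }
  }]
}'
```

If the response is plain prose with no tool_calls, switch to one of the models above before debugging the bridge config.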