A weekly newsletter workspace where the agent drafts on Friday afternoon from meeting transcripts, you review and approve in Dock, and the second script ships to Slack or email on Monday morning.
Time: 30 min setup, ongoing ~10 min/week review. Difficulty: beginner. For: marketing leads, ops leads, and founders who own a weekly digest on a 5-50 person team.
How this works
Open it, hand it to your agent, walk the steps.
Paste this to your agent (Claude / Cursor / Codex)
You are the agent running on the "Weekly newsletter from transcripts" template workspace, connected via MCP at your-org/weekly-newsletter-from-transcripts.
Your job: every Friday at 4 PM, read the week's meeting transcripts, extract highlights, draft a newsletter into Sent issues, and queue a Newsletter queue row at Status=Draft. Never approve or send.
User-loop protocol:
- You propose. The operator decides. Never flip Status past Draft. Never modify a Newsletter queue row once it's beyond Draft.
- Friday 4 PM (or "draft newsletter"): pull transcripts from TRANSCRIPT_DIR or Fireflies (past 7 days, up to 20). Skip files already in processed_transcripts.json. For each new transcript, build a single combined Claude prompt that extracts: week_of, headline, summary, wins, decisions, action_items, blockers, coming_up.
- Build the newsletter markdown. Append it as a new section to Sent issues doc surface, prefixed with the week label and a DRAFT callout.
- Add a Newsletter queue row: Week, Headline, Sources (comma-separated transcript IDs), Status=Draft, Generated At=now, Sent At=blank.
- Add transcript IDs to processed_transcripts.json so the same meeting never appears in two newsletters.
- End of every working session, write 1 paragraph to Status: transcripts processed, draft posted, next due Friday.
Don't touch:
- Newsletter queue.Status (Draft / Approved / Sent is the operator's flow).
- Newsletter queue.Sent At (the send script stamps this).
- Sent issues sections from prior weeks (those are history).
First MCP tool calls:
1. list_surfaces(workspace_slug="weekly-newsletter-from-transcripts")
2. get_doc(workspace_slug="weekly-newsletter-from-transcripts", surface_slug="setup-guide")
3. get_doc(workspace_slug="weekly-newsletter-from-transcripts", surface_slug="status")
Top to bottom. Each step has tasks, pointers, gotchas.
01 / 05
Pick the transcript source
5 min
The agent reads from one of two sources: a local folder of .txt/.md/.vtt files (Zoom, Otter, Granola exports), or Fireflies via its GraphQL API. Pick one before you wire anything else. Local is simplest; Fireflies is right if your team already records there.
Tasks
Decide: local folder or Fireflies
Local: pick a path on your laptop, e.g. ~/transcripts/. Confirm your meeting tool exports plain-text transcripts there.
Fireflies: open Fireflies Settings, Integrations, API Key. Copy the key for .env later.
If you have more than 20 meetings per week, note that to the agent so it adds a date-range filter to the Fireflies query.
Gotchas
VTT files (raw WebVTT with timestamps) work; the script strips timestamps automatically. Plain-text exports are still preferred.
Fireflies fetches up to 20 transcripts per run. Teams running 30+ meetings/week need a date-range filter on the query.
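If you go with the local folder, the timestamp-stripping the gotcha mentions amounts to something like this. This is a sketch of the idea, not the shipped script; the version in the Setup guide is authoritative:

```python
import re

def load_transcript(text: str) -> str:
    """Strip WebVTT headers, cue numbers, and timing lines,
    keeping only the spoken text. Plain .txt/.md passes through unchanged."""
    lines = []
    for line in text.splitlines():
        s = line.strip()
        if not s or s == "WEBVTT" or s.isdigit():
            continue
        # drop cue timing lines like "00:00:01.000 --> 00:00:04.000"
        if re.match(r"^\d{2}:\d{2}:\d{2}[.,]\d{3}\s*-->", s):
            continue
        lines.append(s)
    return "\n".join(lines)
```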
02 / 05
Wire .env + the two Python scripts
15 min
The pipeline is two scripts: generate_newsletter.py drafts on Friday, send_newsletter.py ships on Monday. Both read .env, both run as standalone CueAPI handlers. Install anthropic, requests, and python-dotenv; drop in the scripts from the Setup guide; configure your environment.
Tasks
Open Setup guide (doc) and copy generate_newsletter.py + send_newsletter.py into a local folder
Run pip install anthropic requests python-dotenv
Create .env with DOCK_API_KEY, DOCK_WORKSPACE_SLUG, ANTHROPIC_API_KEY, TRANSCRIPT_DIR or FIREFLIES_API_KEY, NEWSLETTER_NAME, NEWSLETTER_STYLE=internal, DISTRIBUTION_CHANNEL=slack, SLACK_BOT_TOKEN, SLACK_CHANNEL, optional SMTP_* for email, CLAUDE_MODEL=claude-sonnet-4-6
Generate a Dock API key at trydock.ai/settings/api
Run python generate_newsletter.py once manually with a test transcript. Confirm a draft section appears in Sent issues and a row appears in Newsletter queue.
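The variables in the tasks above can be collected into a .env sketch like this. Every value below is a placeholder; keep only the source (TRANSCRIPT_DIR or FIREFLIES_API_KEY) and channel you actually picked:

```shell
# .env — placeholders only; substitute your real keys
DOCK_API_KEY=your-dock-api-key
DOCK_WORKSPACE_SLUG=weekly-newsletter-from-transcripts
ANTHROPIC_API_KEY=your-anthropic-key
TRANSCRIPT_DIR=~/transcripts/
# FIREFLIES_API_KEY=your-fireflies-key   # use instead of TRANSCRIPT_DIR
NEWSLETTER_NAME="Weekly Digest"
NEWSLETTER_STYLE=internal
DISTRIBUTION_CHANNEL=slack
SLACK_BOT_TOKEN=xoxb-your-bot-token
SLACK_CHANNEL=#team-updates
# SMTP_HOST=, SMTP_PORT=, SMTP_USER=, SMTP_PASSWORD=   # email channel only
CLAUDE_MODEL=claude-sonnet-4-6
```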
Gotchas
DOCK_WORKSPACE_SLUG must match the workspace you forked this template into. Renaming the workspace means updating the env var.
Empty transcript file or 0 new files means the script exits with a no-op. Check processed_transcripts.json if you expected a draft and got nothing.
Agent prompt for this step
Run a first draft of the newsletter. Read transcripts from the configured source (TRANSCRIPT_DIR or Fireflies, past 7 days). Skip any already in processed_transcripts.json. Extract wins, decisions, action items, blockers, coming up. Append the formatted newsletter as a new section to Sent issues doc. Add a Newsletter queue row at Status=Draft. Post a Status entry summarizing: transcripts processed, draft posted, next Friday due date.
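The "skip anything already in processed_transcripts.json" step can be sketched as follows. This assumes the file is a flat JSON array of transcript IDs; the real script from the Setup guide may store more:

```python
import json
from pathlib import Path

STATE = Path("processed_transcripts.json")

def load_processed() -> set[str]:
    """IDs of transcripts that already appeared in a newsletter."""
    if STATE.exists():
        return set(json.loads(STATE.read_text()))
    return set()

def mark_processed(new_ids: list[str]) -> None:
    # Append after a successful draft so the same meeting
    # never appears in two newsletters.
    seen = load_processed() | set(new_ids)
    STATE.write_text(json.dumps(sorted(seen), indent=2))
```

Deleting the file (or removing specific IDs) forces a re-run, as noted in the FAQ.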
03 / 05
Pick the distribution channel
5 min
Once you approve a draft, the send script ships it. Three options: Slack (one channel, header + headline + link back to Dock), Email (SMTP, plain text body with link), or Dock-only (the draft lives in the doc, you copy-paste manually). Slack is the default for internal team digests.
Tasks
Slack: create a bot app at api.slack.com, add the chat:write scope, invite the bot to your channel, paste SLACK_BOT_TOKEN + SLACK_CHANNEL into .env
Email: set SMTP_HOST, SMTP_PORT, SMTP_USER, SMTP_PASSWORD, EMAIL_RECIPIENTS (comma-separated) in .env. For Gmail, enable App Passwords and use that as SMTP_PASSWORD.
Dock-only: leave DISTRIBUTION_CHANNEL=dock-only. The send script will no-op and just confirm the draft is live in the doc.
Run python send_newsletter.py once manually after approving a test draft. Confirm the message lands.
Gotchas
Gmail SMTP requires App Passwords, not your account password. The send script will error with a 535 if you skip this.
Slack bots must be explicitly invited to private channels. Public channels accept the message as soon as chat:write is granted.
04 / 05
Schedule the Friday draft + Monday send
10 min
Two cron tasks: generate on Friday at 4 PM, send on Monday at 9 AM (or whatever fits your team's rhythm). CueAPI is the right pick if you want cloud-scheduled runs that survive your laptop closing. Cron works fine if you have an always-on machine.
Tasks
Option A, cron: crontab -e, add `0 16 * * 5 cd /path && source .env && python3 generate_newsletter.py >> generate.log 2>&1` and `0 9 * * 1 cd /path && source .env && python3 send_newsletter.py >> send.log 2>&1`
Option B, CueAPI: pip install cueapi cueapi-worker, then cueapi create --schedule '0 16 * * 5' --name 'Newsletter Generate' and cueapi create --schedule '0 9 * * 1' --name 'Newsletter Send'. Start the worker with --handler ./generate_newsletter.py and --handler ./send_newsletter.py.
Confirm the next Friday: Status has a fresh session entry, Sent issues has a new draft section, Newsletter queue has a fresh Draft row.
Gotchas
Cron requires the machine awake at the fire time. Close your laptop overnight on Thursdays? Switch to CueAPI.
CueAPI worker must be supervised (launchd / systemd) so it survives reboots. Run cueapi-worker start once + add it to your boot config.
05 / 05
Set the weekly review cadence
10 min/week ongoing
The agent surfaces drafts; you decide what ships. Friday late afternoon or Monday morning before the send fires is the right time to open Sent issues, edit the draft in place, and flip Status to Approved. The send runs at 9 AM Monday by default.
Tasks
Block 10 min on the calendar Monday morning (or Friday 5 PM if you prefer)
Open Sent issues (doc). Read the latest draft section. Edit headline, summary, wins, blockers in place.
Flip the matching Newsletter queue row Status from Draft to Approved.
Confirm Monday: row Status flips to Sent + Sent At gets stamped + the Slack/email message lands.
Gotchas
Forgetting to flip Status to Approved means the draft never ships. The send script is a no-op when nothing is Approved.
If you flip Status before Monday's send fires, the draft ships at the next run. Want to send immediately? Trigger python send_newsletter.py manually.
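The approval gate these gotchas describe can be sketched like this. Column names match the Newsletter queue table from the template; the actual Dock read/write calls are omitted, so treat this as the shape of the logic, not the shipped send script:

```python
from datetime import datetime, timezone

def rows_to_send(queue_rows: list[dict]) -> list[dict]:
    # Only ship rows the operator explicitly approved. Draft rows are
    # skipped, so a forgotten flip means a silent no-op, never an
    # accidental send.
    return [r for r in queue_rows if r.get("Status") == "Approved"]

def stamp_sent(row: dict) -> dict:
    # After a successful send, the script (not the agent) flips the row
    # to Sent and stamps Sent At.
    row["Status"] = "Sent"
    row["Sent At"] = datetime.now(timezone.utc).isoformat()
    return row
```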
FAQ
Common questions on this template.
What if my team doesn't use Fireflies?
Use the local folder source instead. Set TRANSCRIPT_DIR to a path on your laptop, drop your meeting transcripts there as .txt or .md files, and the agent reads them on each Friday run. Zoom, Otter, Granola, and most transcription tools export plain text. VTT files work too; the script strips timestamps automatically.
Does the agent send the newsletter on its own?
No. The agent drafts and queues. You read, edit, and flip the queue row Status from Draft to Approved. Only then does the second script ship. The agent never touches Status past Draft. This keeps the human as the editor of record.
What if I want the newsletter to go to investors instead of the team?
Set NEWSLETTER_STYLE=external in .env. The prompt switches to a polished outward-facing tone suitable for board updates or investor digests. Everything else stays the same. You can also change EMAIL_RECIPIENTS or SLACK_CHANNEL between weeks if the audience shifts.
Can the same transcript appear in two newsletters?
No. The script writes every transcript ID to processed_transcripts.json after a successful draft. The next run skips any ID already in that file. If you want to force a re-run, delete the file (or remove specific IDs).
Does this work with Notion or Slack canvas as the source instead of meeting transcripts?
Not in v1. Sources today are the local folder (any plain-text file) or Fireflies (GraphQL). A future extension can pull from Notion pages or Slack canvas via their respective APIs. Until then, paste the relevant content into a .txt file in TRANSCRIPT_DIR before the Friday run.
Open this template as a workspace.
We mint a fresh copy in your org with the steps as table rows, the pointers as a separate table, and the brief as a doc. Bring your agents, start checking off boxes.