
Set up SEO foundations for a new product

12-step playbook for the technical + content SEO foundations that compound: site architecture, schema, content map, then the first 30 indexed pages.

20 min read · from trydock.ai

A 12-step playbook. Open in Dock and you'll get four surfaces:

- **Steps** (table) — 12 gates from domain selection to the first 30 indexed pages
- **Keyword map** (table) — every target keyword: cluster, intent, volume, difficulty, target page
- **Brief** (doc) — your positioning + ICP + the page-template specifications
- **Indexing log** (table) — every page submitted to Google: status, last-crawled, indexed, ranking

Read `Steps` top-to-bottom. The first four steps are technical foundations that take a day or two; the rest is content production.

Outcome

A new product site with its technical SEO foundations in place, structured data validated, a keyword map of 30+ target queries, and the first 30 pages indexed with rising rankings.

Estimated time: 2-3 weeks for foundations + first 30 pages
Difficulty: intermediate
For: Founders + product engineers shipping a new site.

What you'll need

Sign up for or install these before you start.

  • Google Search Console (Free) — Verify ownership, submit sitemaps, monitor indexing + clicks + impressions.
  • Ahrefs ($99/mo Lite plan) — Keyword research, competitor analysis, backlink tracking, ranking history.
  • SEMrush ($140/mo Pro) — Alternative keyword research with stronger US data and SERP feature analysis.
  • Screaming Frog (Free up to 500 URLs, £199/year unlimited) — Crawl your own site to find broken links, missing meta, redirect chains.
  • Schema.org (Free) — The structured data vocabulary spec.
  • Google Rich Results Test (Free) — Validate schema markup before deploy.

The template · 12 steps

Step 1: Pick a domain that won't sabotage you in 5 years

Estimated time: 1-3 days research, 1 hour purchase

The domain choice is permanent infrastructure. Short, brandable, .com if possible, no hyphens. Avoid descriptive keyword domains: the old exact-match ranking boost is gone, and they age badly as the product's scope grows. Avoid trendy TLDs (.io, .ai) for consumer products: they reduce trust signals, and .com still beats them on conversion.

Tasks

  • Brainstorm 20 candidate domains — short, brandable, memorable
  • Check trademark on uspto.gov + eu equivalent for top 5
  • Check .com availability via Namecheap / Cloudflare / Porkbun
  • Cross-check social handles (X, Instagram, LinkedIn) for top 3 finalists
  • Buy with WHOIS privacy enabled at a reputable registrar (Cloudflare, Porkbun)

Pointers

[!CAUTION] Gotchas

  • Hyphenated domains (foo-bar.com) underperform in click-through rate vs single-word equivalents.
  • Trendy TLDs (.io, .ai) work for B2B developer tools but tank conversion for consumer products vs .com.
  • Domain age correlates with ranking, largely via accumulated backlinks and crawl history. If you're choosing between a 5-year-old expired .com with a clean backlink profile and a brand-new equivalent, the older one tends to outperform in the first year.

Step 2: Set up DNS, hosting, SSL, and CDN before you write any code

Estimated time: 2-4 hours

These are infra-level decisions that affect SEO: HTTPS is a confirmed ranking factor, TTFB and response time matter (hence the CDN), and migrating later is a months-long headache. Pick a CDN-fronted host on day one (Vercel, Netlify, and Cloudflare Pages all work) so the production site is fast from the first request.

Tasks

  • Point DNS at a CDN-fronted host (Vercel, Netlify, Cloudflare Pages)
  • Verify SSL/TLS auto-provisioning is working (curl -I should return 200 on https://)
  • Set up www → root redirect (or root → www, pick one and stick with it)
  • Set up the production HTTPS canonical and force redirects from HTTP
  • Test page-load: TTFB < 200ms, LCP < 2.5s on 3G simulation

Pointers

[!CAUTION] Gotchas

  • HTTP-to-HTTPS migrations later cost weeks of indexing recovery. Set HTTPS-only from day one.
  • Pick www OR root and stick with it. Switching after launch tanks indexing for weeks.
  • Page speed is a real ranking factor for borderline-quality content. Slow + good content underperforms fast + good content.

Step 3: Define the URL architecture before writing the first page

Estimated time: 1-2 hours

URL structure is permanent. Decide: /blog/post-slug or /posts/post-slug? Lowercase + hyphens, no trailing slashes (or always trailing — pick one). Decide on subdomains vs subdirectories: blog.foo.com vs foo.com/blog. Subdirectories pass authority to your root domain and almost always win for new products.

Tasks

  • Pick the URL pattern: lowercase, hyphens, no underscores, no trailing slash inconsistency
  • Pick subdirectory over subdomain (foo.com/blog beats blog.foo.com for SEO)
  • Document the URL pattern for: /blog, /docs, /pricing, /[slug] for product pages
  • Set up canonical tags on every page (rel=canonical) pointing at the canonical URL
  • Decide on /docs/x/y vs /docs/x-y for nested taxonomies

Pointers

[!CAUTION] Gotchas

  • URL structure changes after launch require 301 redirects on every old URL. Get it right on day one.
  • Subdomains (blog.foo.com) split your domain authority across two roots. Use subdirectories.
  • Trailing slash inconsistency creates duplicate-content URLs (foo.com/page and foo.com/page/ both indexed). Pick one.
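A slug normaliser encoding the rules above (lowercase, hyphens, no underscores, no trailing slash) is worth wiring into your router or build step so the policy is enforced, not remembered. A sketch; the exact character policy is yours to set.

```python
import re

def normalise_path(path: str) -> str:
    """Apply the URL-pattern rules: lowercase, hyphens, no trailing slash."""
    path = path.lower().replace("_", "-")
    path = re.sub(r"-{2,}", "-", path)  # collapse runs of hyphens
    if path != "/" and path.endswith("/"):
        path = path.rstrip("/")  # no-trailing-slash policy
    return path
```

Run it over every generated URL at build time and 301 any request whose path doesn't equal its normalised form.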

Step 4: Build sitemap.xml and robots.txt

Estimated time: 1-2 hours

These files take an hour or two and decide whether Google can crawl your site. sitemap.xml lists every URL you want indexed; robots.txt tells crawlers what NOT to crawl. Get both right and submit the sitemap to Search Console. A wrong robots.txt can block Google from your entire site, a common production-launch foot-gun.

Tasks

  • Generate sitemap.xml dynamically (or as a build artifact) listing every public URL with lastmod date
  • Add the sitemap URL to robots.txt
  • Write robots.txt: allow crawlers, disallow /api, /admin, /preview, /draft
  • Verify both files with curl in production: curl https://yourdomain.com/sitemap.xml
  • Submit the sitemap to Google Search Console + Bing Webmaster Tools

Pointers

[!CAUTION] Gotchas

  • Disallow: / in robots.txt blocks all crawling. The single fastest way to make your site invisible.
  • Sitemap with stale lastmod dates makes Google crawl less often. Update lastmod when content changes.
  • Sitemaps can list at most 50,000 URLs. Larger sites need sitemap-index.xml referencing multiple sitemaps.
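A minimal sitemap build step, as a sketch: it assumes your build produces (URL, lastmod date) pairs and enforces the 50,000-URL limit from the gotcha above.

```python
from xml.sax.saxutils import escape

def build_sitemap(pages: list[tuple[str, str]]) -> str:
    """Render (url, lastmod) pairs as sitemap.xml. Max 50,000 URLs per file."""
    if len(pages) > 50_000:
        raise ValueError("split into multiple sitemaps + a sitemap-index.xml")
    entries = "\n".join(
        f"  <url><loc>{escape(url)}</loc><lastmod>{lastmod}</lastmod></url>"
        for url, lastmod in pages
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )
```

Generating it from the same data source that renders your pages is what keeps lastmod honest: stale dates are worse than none.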

Step 5: Build the keyword map: 30-50 target queries clustered by intent

Estimated time: 1-2 days

Every page should target one primary keyword + 3-5 variant phrasings. The keyword map is the master plan: 30-50 keywords clustered into 10-15 topic clusters, each cluster targeting one pillar page + supporting cluster pages. Build this in Ahrefs / SEMrush from competitor research + autocomplete + Search Console once you have data.

Tasks

  • List 5-10 competitors and pull their top 50 ranking keywords each via Ahrefs
  • Filter for keywords with: > 50 monthly searches, < 30 difficulty (for new sites), informational or commercial intent
  • Cluster into 10-15 topic groups (e.g., 'cold email reply rate', 'B2B outbound playbook', 'first-touch email examples')
  • Pick the primary keyword for each cluster + 3-5 supporting variants
  • Map each cluster to a target URL (pillar) + supporting URLs
  • Save in the Keyword map table: keyword, cluster, intent, volume, difficulty, target URL, status

Pointers

[!CAUTION] Gotchas

  • Targeting one big keyword per page (e.g., 'CRM') without supporting clusters never ranks for new sites. Cluster strategy is the play.
  • Difficulty scores are competitor-relative. A 'difficulty 30' keyword still requires backlinks if every top-10 ranking site has 200 referring domains.
  • Don't target the same primary keyword on two different pages. You'll cannibalise your own ranking.
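The filter-then-cluster pass is mechanical enough to script over an exported keyword list. A sketch assuming rows shaped like the Keyword map columns; the thresholds are the ones from the tasks above.

```python
def filter_keywords(rows, min_volume=50, max_difficulty=30):
    """Keep keywords a new site can plausibly rank for."""
    return [r for r in rows
            if r["volume"] > min_volume and r["difficulty"] < max_difficulty]

def pick_primaries(rows):
    """Within each cluster, the highest-volume keyword becomes the primary."""
    primaries = {}
    for r in rows:
        best = primaries.get(r["cluster"])
        if best is None or r["volume"] > best["volume"]:
            primaries[r["cluster"]] = r
    return primaries
```

Everything that survives the filter but isn't a primary becomes a supporting variant on its cluster's row, which also guards against two pages claiming the same primary.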

Agent prompt for this step

Build the keyword map for our new product.

Inputs:
- Product Brief (Brief surface in this workspace)
- 5-10 competitor URLs from the user

For each competitor:
1. Identify their top 50 ranking keywords (via Ahrefs or SERP scraping)
2. Filter to: > 50 monthly searches, < 30 difficulty, intent matches our ICP

Cluster the surviving keywords:
1. Group by topic into 10-15 clusters
2. For each cluster, pick the primary keyword + 3-5 variants
3. Pick the intent: informational ('what is X'), commercial ('best X for Y'), comparison ('X vs Y'), or branded
4. Pick the target URL: a pillar page or a cluster page

Output to the Keyword map table:
- One row per keyword
- Columns: keyword, cluster, intent, monthly volume, difficulty, target URL, priority (high/med/low)

Recommend the 5 highest-leverage clusters to write first.

Step 6: Build the page template: title, meta, H1, schema, structure

Estimated time: 4-6 hours

Every page on an SEO-foundations site follows the same template skeleton: title tag (50-60 chars), meta description (150-160 chars), H1 matching intent, structured H2/H3 hierarchy, schema markup (Article / HowTo / FAQPage / Product), and internal links to related pages. Build this template once; the agent fills it for every new page.

Tasks

  • Write the page-template spec: title pattern, meta pattern, H1 pattern, intro paragraph pattern, internal-link rules
  • Build the schema markup pattern for each page type (Article for blog, HowTo for tutorial, FAQPage for FAQ, Product for product pages)
  • Wire schema into the layout component so every page emits valid JSON-LD
  • Validate via Google Rich Results Test on a sample page
  • Document the template in the Brief doc so the agent can populate it for every new page

Pointers

[!CAUTION] Gotchas

  • Title tags > 60 chars get truncated in the SERP. Brand-last so the truncation hits the brand, not the keyword.
  • Meta description doesn't directly affect ranking but does affect CTR, which compounds rank. Don't skip it.
  • Schema with invalid JSON-LD silently fails to render rich results. Always validate before deploy.
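Two checks worth wiring into the build: length limits on title and meta, and JSON-LD emitted through a serialiser rather than hand-written strings (which is how invalid JSON-LD sneaks in). A sketch; the field values are placeholders, and Article here stands in for whichever schema type the page uses.

```python
import json

def check_meta(title: str, description: str) -> list[str]:
    """Return template-spec violations (empty list means the page passes)."""
    problems = []
    if not 50 <= len(title) <= 60:
        problems.append(f"title is {len(title)} chars, want 50-60")
    if not 150 <= len(description) <= 160:
        problems.append(f"meta description is {len(description)} chars, want 150-160")
    return problems

def article_jsonld(headline: str, date_published: str, author: str) -> str:
    """Emit Article JSON-LD; json.dumps guarantees it parses as JSON."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "datePublished": date_published,
        "author": {"@type": "Person", "name": author},
    })
```

Passing JSON is necessary but not sufficient for rich results, so still run sample pages through the Rich Results Test; this only stops the silent syntax failures.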

Agent prompt for this step

Draft the SEO meta for a new page.

Inputs from the Keyword map row:
- primary keyword
- cluster
- intent
- target URL
- 3-5 supporting variants

Inputs from the Brief:
- product positioning
- audience
- page-template spec

Output:
1. Title tag (50-60 chars): primary keyword first, brand last. e.g., "Cold email reply rate guide | Brandname"
2. Meta description (150-160 chars): primary keyword once + soft CTA. No exclamation marks.
3. H1 (matches title, slightly more natural-language): "Cold email reply rate: a 5-day playbook"
4. Intro paragraph (60-100 words): hook the reader on the user pain, set up what they'll learn
5. H2/H3 outline (8-12 headers): structure the body
6. JSON-LD schema (HowTo / Article / FAQPage based on intent): valid schema.org markup with @type, headline, datePublished, author, image
7. 5-10 internal links to related pages from the Keyword map

Output as a structured update with each component clearly labeled.

Step 7: Set up Google Search Console + Bing Webmaster Tools

Estimated time: 30 min

Search Console is the only ground-truth view of how Google sees your site: which pages are indexed, which are crawled, which queries drive traffic, which pages have problems. Set it up day one, verify ownership, and check it weekly. Bing's tools are similar; less traffic but lower competition.

Tasks

  • Add the property in Google Search Console using DNS TXT record verification
  • Submit the sitemap.xml URL
  • Add the property in Bing Webmaster Tools
  • Wait 24-48 hr for first crawl + first impressions data
  • Set up email alerts for: indexing issues, manual actions, security issues

Pointers

[!CAUTION] Gotchas

  • DNS TXT verification is more reliable than HTML file verification. The file can break if your hosting changes.
  • Search Console data lags 48-72 hours. Don't make decisions based on yesterday's data.
  • Search Console only shows top 1000 queries per page. The long-tail traffic is real but invisible in this view.

Step 8: Write the first 5 pillar pages (the cluster anchors)

Estimated time: 5-7 days

Pillar pages are the long-form, comprehensive resources at the top of each topic cluster. 1500-3000 words, broken into deep H2/H3 sections, internal-linked to every supporting cluster page. The agent drafts; humans edit for voice + accuracy. Pillars take 1-2 days each.

Tasks

  • Pick the 5 highest-leverage clusters from the Keyword map (highest volume + lowest difficulty)
  • Write each pillar page: 1500-3000 words, H2/H3 structure, schema, internal links to (yet-to-be-written) cluster pages
  • Include real data: numbers, examples, screenshots, quotes
  • Internal-link each pillar to 3-5 other pillars (build the topic graph)
  • Submit each new page URL to Search Console for indexing

Pointers

[!CAUTION] Gotchas

  • Pillar pages with thin content (< 800 words) don't rank for the head term. Aim for genuine comprehensiveness.
  • Don't AI-generate the entire pillar. Search engines and readers both detect generic AI prose; the agent drafts the structure, the human owns the voice.
  • Pillar pages need refreshing every 6-12 months to maintain ranking. Add 'Updated' dates and ship updates.

Step 9: Write 25 supporting cluster pages

Estimated time: 10-14 days

Cluster pages are the long-tail captures: each targets a specific 'how to X' or 'X vs Y' or 'what is X' query. 600-1500 words each, focused, internal-linked back to the pillar + sibling pages. The agent can generate the first draft; human polishes. Aim for 25 cluster pages across the 5 pillars (5 each).

Tasks

  • For each pillar, identify 5 cluster keywords from the Keyword map
  • For each cluster keyword, draft a 600-1500 word page targeting that specific query
  • Internal-link each cluster page back to its pillar + 2-3 sibling cluster pages
  • Add schema (Article / HowTo / FAQPage based on intent)
  • Submit each new page URL to Search Console

Pointers

[!CAUTION] Gotchas

  • Cluster pages without internal links to the pillar leak ranking. Always link UP.
  • Pages targeting overlapping keywords cannibalise. Each cluster page targets a distinct phrase.
  • Don't bulk-publish 25 pages on one day. Stagger 2-3/day so Google doesn't tag the burst as automated.
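The publish stagger is easy to fumble by hand once the backlog grows. A trivial sketch that splits a backlog into daily batches at the 2-3/day cadence above.

```python
def stagger(pages: list[str], per_day: int = 3) -> list[list[str]]:
    """Split a publish backlog into daily batches of at most `per_day` pages."""
    return [pages[i:i + per_day] for i in range(0, len(pages), per_day)]
```

25 cluster pages at 3/day is nine publish days, which conveniently also spreads the Search Console submissions.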

Agent prompt for this step

Draft a cluster page for a long-tail keyword.

Inputs from the Keyword map row:
- primary keyword
- cluster
- intent
- target URL

Inputs from the Brief:
- the corresponding pillar page URL + summary
- the page-template spec

Output:
1. Title tag (50-60 chars)
2. Meta description (150-160 chars)
3. H1 + 600-1500 word body in markdown with H2/H3 structure
4. JSON-LD schema appropriate to intent
5. 3-5 internal links: 1 to the pillar, 2-3 to sibling cluster pages, 1-2 to product pages

Constraints:
- Lead with the answer (no preamble). Reader scrolls if curious.
- Include 1-2 real examples, numbers, or screenshots
- Don't pad to hit word count. 800 substantive words beats 1500 fluffy.
- Don't repeat the pillar's content; the cluster page goes DEEPER on a narrower question

Output as a markdown draft.

Step 10: Build the internal-linking graph

Estimated time: 2-3 hours

Internal links pass authority and signal topic relevance. Every page should link to 3-7 related pages and be linked FROM 3-7 related pages. Build the link graph deliberately: pillar ↔ cluster, cluster ↔ sibling, deep pages → up to pillar. Use Screaming Frog to audit your live link graph.

Tasks

  • Run Screaming Frog on your live site to map the current internal link graph
  • Find orphan pages (no internal links pointing to them) and add 3+ links from related pages
  • Find pages with > 7 internal links pointing TO them (over-emphasised) and rebalance
  • Audit anchor text: descriptive ('5-day cold email playbook') beats generic ('click here')
  • Document the linking rule in the Brief: every new page must add 3+ inbound links from existing pages

Pointers

[!CAUTION] Gotchas

  • Orphan pages don't rank because Google can't crawl to them efficiently. Audit weekly during launch phase.
  • Anchor text 'click here' is wasted SEO equity. Use the target keyword (or a natural variant) as anchor.
  • Footer links to every page on every page (the WordPress 'related posts' anti-pattern) dilute link equity. Use contextual links in body text.
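The orphan and over-emphasis audit reduces to inbound-degree counting over the exported link graph. A sketch assuming edges as (from, to) pairs, e.g. parsed from a Screaming Frog export; the >7 threshold is the rule of thumb from above, not a hard limit.

```python
from collections import Counter

def link_audit(all_pages: set[str], edges: list[tuple[str, str]]):
    """Return (orphans, over_linked): pages with 0 inbound links, and > 7."""
    inbound = Counter(dst for _, dst in edges)
    orphans = {p for p in all_pages if inbound[p] == 0}
    over_linked = {p for p, n in inbound.items() if n > 7}
    return orphans, over_linked
```

Run it weekly during the launch phase and feed the orphans list straight into the "add 3+ inbound links" task.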

Step 11: Submit, monitor, fix indexing issues

Estimated time: Ongoing, ~30 min/week

Submitting a sitemap doesn't guarantee indexing. Some pages get indexed in 2 days; some take 60. Watch the Search Console Indexing report weekly. Common issues: 'Discovered, not indexed' (Google found the URL but hasn't crawled it yet; often thin content or crawl-budget limits), 'Crawled, not indexed' (Google crawled the page but chose not to index it; a quality issue), 'Soft 404' (Google thinks the page is empty).

Tasks

  • Weekly: open Search Console → Pages and review the indexing breakdown
  • For 'Discovered, not indexed' pages: check content depth, internal links, schema
  • For 'Crawled, not indexed': improve content quality, add unique value, request reindexing
  • For 'Soft 404': add real content, more links, schema
  • Use the URL Inspection tool to debug specific pages and request indexing

Pointers

[!CAUTION] Gotchas

  • Manual 'request indexing' has a daily quota of ~10. Use it for high-priority pages, not every new post.
  • If 30%+ of submitted URLs are 'Discovered, not indexed' after 30 days, the issue is content depth, not technical.
  • New domains are sandboxed for 3-6 months: Google indexes but ranks low. Patience compounds; abandoning compounds nothing.

Step 12: Track rankings, refresh top losers, double down on winners

Estimated time: 2-3 hours/month, ongoing

After 60 days you'll have early signal: which pages climb, which stagnate, which lose ranking. Refresh the top 5 losers monthly with new sections / new examples / better internal links. Double down on the winners by creating 2-3 supporting cluster pages around each climbing page. Don't quit clusters that are still in the sandbox; do quit clusters that aren't moving after 6 months.

Tasks

  • Monthly: pull Search Console data on every page (impressions, clicks, position)
  • Identify the top 5 climbers (rising in average position) — write 2-3 supporting cluster pages around each
  • Identify the top 5 losers (falling in average position) — refresh content, add internal links, fix schema
  • Identify the top 5 'stuck' (impressions but no clicks) — rewrite the title + meta description for CTR
  • Update the Indexing log with status changes; surface the deltas in the Brief

Pointers

[!CAUTION] Gotchas

  • Refreshing a page resets its content-freshness signal but doesn't undo Google's ranking history. Refresh strategically, not constantly.
  • Don't chase every algorithm update. Focus on intent + content quality; Google's intent-matching gets better over time.
  • Backlinks compound; if you're 6 months in with strong content but no rankings, the issue is referring domains, not on-page SEO.
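The monthly triage reduces to position deltas and CTR over two Search Console exports. A sketch assuming per-page dicts with `position`, `impressions`, and `clicks`; the 100-impression and 1% CTR thresholds for "stuck" are illustrative, tune them to your traffic.

```python
def triage(prev: dict, curr: dict, n: int = 5):
    """Classify pages into climbers, losers (by position delta), and stuck."""
    deltas = {
        page: prev[page]["position"] - curr[page]["position"]  # positive = moved up
        for page in curr if page in prev
    }
    ranked = sorted(deltas, key=deltas.get, reverse=True)
    climbers = [p for p in ranked[:n] if deltas[p] > 0]
    losers = [p for p in ranked[::-1][:n] if deltas[p] < 0]
    stuck = [p for p, m in curr.items()
             if m["impressions"] >= 100 and m["clicks"] / m["impressions"] < 0.01]
    return climbers, losers, stuck
```

Each bucket maps to one action from the tasks above: climbers get supporting pages, losers get a refresh, stuck pages get a title/meta rewrite.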

Agent prompt for this step

Run the monthly SEO ranking audit.

Inputs:
- Search Console export (pages report) for the last 30 days
- The Keyword map (target keyword per page)
- The Indexing log (every page submitted)

Output to the Brief doc as a markdown section:

1. **Climbers**: top 5 pages with biggest average-position improvement. For each: current rank, target keyword, recommended next action (supporting cluster pages, more internal links).
2. **Losers**: top 5 pages with biggest average-position decline. For each: rank dropped from / to, suspected cause (algorithm update, competitor refresh, decay), recommended fix.
3. **Stuck**: top 5 pages with high impressions + low CTR. For each: current title + meta, recommended rewrite to lift CTR.
4. **Quick wins**: queries showing impressions but where our page ranks 11-20. Often a fresh refresh + 3 internal links pushes them to top 10.
5. **Three concrete recommendations** for next month.

Hand the template to your agent

Paste the prompt below into your agent's permanent system prompt so the agent reads, writes, and maintains this workspace as you work through the steps.

You are an agent on the "Set up SEO foundations" playbook workspace at your-org/set-up-seo-foundations-for-a-new-product.

Your role: maintain the keyword map, draft page templates, track indexing.

Cadence:
- Weekly: pull Search Console data (impressions, clicks, average position) for every page in the Indexing log; surface the top 10 climbing + top 10 losing in the Brief.
- For each new page added: draft the meta (title, description, H1, intro, schema) before the human writes the body.
- Monthly: refresh the Keyword map with new long-tail opportunities surfaced from Search Console queries.

First MCP tool calls:
1. list_surfaces(workspace_slug="set-up-seo-foundations-for-a-new-product")
2. list_rows(workspace_slug="set-up-seo-foundations-for-a-new-product", surface_slug="keyword-map")
3. get_doc(workspace_slug="set-up-seo-foundations-for-a-new-product", surface_slug="brief")

Constraints:
- Title tags: 50-60 chars, lead with the primary keyword, end with the brand
- Meta descriptions: 150-160 chars, include the primary keyword once, end with a soft CTA
- H1: matches user search intent, includes the primary keyword
- Schema: every page needs at least one valid schema type (Article, Product, HowTo, FAQPage)

FAQ

How long until SEO actually drives traffic for a new site?

3-6 months for early signal (first rankings appear, hundreds of impressions/day). 6-12 months for meaningful traffic (1000+ clicks/month from organic). 12-24 months for compounding traffic (5000+ clicks/month). New domains are sandboxed by Google for 3-6 months: pages get indexed but rank low, regardless of content quality. Patience is the rate-limiter; abandoning at month 4 is the most common failure mode.

Do I need backlinks, or is content enough?

Both, but content first. For low-difficulty long-tail keywords, content alone gets you to page 1 in 6-12 months. For competitive head terms, backlinks are required. The pattern: ship 30+ pages of high-quality content, drive natural references through launch + content marketing, then strategically pursue backlinks for the highest-value clusters via guest posts, podcasts, and PR.

Should I use a CMS like WordPress or build the site myself?

For a product site that needs SEO foundations, build it yourself or use a static-site generator (Next.js, Astro, Hugo). WordPress' SEO is fine but its performance + security overhead are an ongoing tax. The bigger question is who controls the templates: if engineering does, build it custom; if marketing does, WordPress + a fast theme is reasonable.

Should I use AI agents to generate the content?

Yes for drafts and structure, no for the final voice. Agents are excellent at: building the keyword map, drafting page metadata (title, meta, schema), drafting outlines, generating first-draft copy. They're worse at: founder voice, real opinions, novel insights, accurate examples. The pattern that ranks: agent drafts, human owns voice + facts + examples. Pure AI-generated content increasingly fails Google's helpful-content evaluation.

What does the SEO foundations stack cost?

Free path: Google Search Console (free) + Screaming Frog (free up to 500 URLs) + manual keyword research = $0. Paid path: Ahrefs ($99/mo) + Screaming Frog (£199/year) + occasional contractor for content (variable) = roughly $150-300/mo for a small product. The biggest cost is the founder-time: writing 30 pages of substantive content takes 60-100 hours.

How do I prioritise: technical foundations, keyword research, or content?

In strict order: technical foundations (1-2 days, get them right), keyword research (1-2 days, build the map), then content (the next 6-12 months). Skipping foundations means every page underperforms. Skipping keyword research means writing content that doesn't match what people search. Content without the first two is content that ranks for nothing.

Remix this into Dock

Make this yours. Edit, extend, run agents on it.

Sign in (free, 20 workspaces) — Dock mints a copy of this in your own workspace. The original stays untouched.

No Dock account? Sign-in is signup. Magic-link in 30 seconds.