April 12, 2026 · SEO

How I Scored 436 Pages for a Domain Migration

A weighted scoring model for prioritizing pages during a domain migration, with the exact formula used on the LumApps and Teach on Mars consolidation.

Most domain migrations lose traffic because teams redirect the wrong pages. They preserve the old URL structure, assume "301 everything," and watch organic search collapse two weeks later when Google reprocesses the signals. The real unlock isn't the redirect map — it's the scoring model that decides which pages belong on the new site in the first place.

When I inherited the Teach on Mars properties during the LumApps consolidation — three acquired brands, one target domain, several thousand URLs — we needed a defensible way to tier each URL into migrate, review, or redirect-only. Here is the weighted scoring formula I built.

The failure mode: 1:1 redirects

The default migration plan looks like this: take every URL on the old site, map it to an equivalent URL on the new site, ship 301s, move on. It is fast and auditors like it because the row count matches. It is also how you lose 40% of your organic traffic in the first month post-cutover.

Three things go wrong:

  • Zombie URLs migrate alongside live ones. Pages with zero clicks in 12 months, pages no longer linked from any navigation, pages that were A/B tested once and forgotten — they all get 301'd. Each one dilutes crawl budget.
  • Near-duplicates collapse the wrong way. When two old URLs both map to one new URL, Google consolidates signals — sometimes toward the weaker page. The strong page gets folded into the weak one's reputation.
  • Intent drift gets preserved. A 2021 blog post optimized for a deprecated product keyword isn't worth migrating. It's worth redirecting to a category page and letting the topical authority re-anchor somewhere useful.

A scoring model fixes all three because it forces you to measure pages before deciding what to do with them.

The inputs

For each URL on the source domain, I pull four metrics from Google Search Console (via the API or a 16-month CSV export):

  1. Unique queries (last 12 months). The count of distinct search terms the page has received at least one impression for. This measures topical breadth. A page with 400 unique queries is doing more semantic work than a page with 4.
  2. Average position (impression-weighted). A page with an impression-weighted average position of 3.2 across its query set is structurally stronger than one sitting at 19.8 — even if their click counts are momentarily similar.
  3. Total clicks (last 12 months). Absolute traffic. The tiebreaker.
  4. Inbound link count (optional, from Ahrefs or GSC Links). For pages with external authority, score them higher regardless of query performance — migrating those without a plan is how you lose backlink equity.
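The four inputs above can be aggregated from a GSC page/query export with a short pass over the rows. This is a sketch, not the exact script from the migration: the field names (`page`, `query`, `clicks`, `impressions`, `position`) are assumptions matching a typical GSC export, and link counts would be joined in separately from Ahrefs or GSC Links.

```python
from collections import defaultdict

def aggregate_url_metrics(rows):
    """Collapse GSC page/query rows into per-URL metrics.

    Each row is a dict with keys page, query, clicks, impressions,
    position (illustrative names, not a fixed GSC schema).
    """
    acc = defaultdict(lambda: {"queries": set(), "clicks": 0,
                               "impressions": 0, "pos_weighted": 0.0})
    for r in rows:
        m = acc[r["page"]]
        m["queries"].add(r["query"])          # topical breadth
        m["clicks"] += r["clicks"]
        m["impressions"] += r["impressions"]
        # accumulate position weighted by impressions
        m["pos_weighted"] += r["position"] * r["impressions"]

    out = {}
    for page, m in acc.items():
        out[page] = {
            "unique_queries": len(m["queries"]),
            "clicks": m["clicks"],
            "avg_position": (m["pos_weighted"] / m["impressions"]
                             if m["impressions"] else 0.0),
        }
    return out
```

Impression-weighting the position matters: averaging raw per-row positions would let a query with 2 impressions count as much as one with 20,000.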

The weight formula

Not every metric deserves equal weight. On the Teach on Mars migration I used: unique queries at 0.40, average position at 0.25 (inverted and normalized, so better rankings score higher), total clicks at 0.25 (log-scaled), and inbound links at 0.10 (log-scaled) — weights summing to 1.00. The absolute score doesn't matter. What matters is that sorted descending, the top of the list is obviously stronger than the bottom.
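The weighting can be sketched in a few lines. The weights are the ones from this article; the min-max normalization and `log1p` scaling are one reasonable reading of "inverted and normalized" and "log-scaled", not necessarily the exact transforms used in production.

```python
import math

WEIGHTS = {"queries": 0.40, "position": 0.25, "clicks": 0.25, "links": 0.10}

def score_pages(pages):
    """pages: dict url -> {unique_queries, avg_position, clicks, links}.

    Returns dict url -> weighted score in [0, 1]. Field names are
    illustrative; the weights match the article.
    """
    def norm(values):
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1.0          # guard against a constant column
        return [(v - lo) / span for v in values]

    urls = list(pages)
    q = norm([pages[u]["unique_queries"] for u in urls])
    # negate position so a lower (better) rank normalizes higher
    p = norm([-pages[u]["avg_position"] for u in urls])
    c = norm([math.log1p(pages[u]["clicks"]) for u in urls])
    l = norm([math.log1p(pages[u]["links"]) for u in urls])
    return {u: (WEIGHTS["queries"] * q[i] + WEIGHTS["position"] * p[i]
                + WEIGHTS["clicks"] * c[i] + WEIGHTS["links"] * l[i])
            for i, u in enumerate(urls)}
```

Because every component is normalized to the same 0–1 range before weighting, the score is only meaningful relative to the other pages in the same run — which is exactly the point: the output is an ordering, not an absolute grade.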

The tiering

  • Top ~7% → Migrate. The keepers. Direct 1:1 port to the new domain. On Teach on Mars this came out to 30 pages.
  • Next ~7% → Review. Manual judgment call. Some get rewritten for the new site, some get consolidated into hub pages, some get dropped. Another 30 pages.
  • Bottom ~86% → Redirect. These 301 to the closest semantic parent on the new site — usually a category or hub, not the homepage. 376 pages.

The exact percentages aren't magic. What matters is that the top tier is small enough for a human to review and the middle tier is small enough for a weekly meeting. If your top tier has 500 pages, you haven't tiered — you've just sorted.
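Turning scores into tiers is a sort plus two cutoffs. A minimal sketch, using the ~7%/~7% splits from this migration as defaults (flooring the cutoff, which on 436 pages yields the 30/30/376 split described above):

```python
def tier_pages(scores, migrate_pct=0.07, review_pct=0.07):
    """scores: dict url -> score. Returns dict url -> tier label.

    Sorts descending and cuts into migrate / review / redirect; the
    default percentages mirror the splits used in the article.
    """
    ranked = sorted(scores, key=scores.get, reverse=True)
    n = len(ranked)
    migrate_n = max(1, int(n * migrate_pct))   # floor: ~7% of 436 -> 30
    review_n = max(1, int(n * review_pct))
    tiers = {}
    for i, url in enumerate(ranked):
        if i < migrate_n:
            tiers[url] = "migrate"
        elif i < migrate_n + review_n:
            tiers[url] = "review"
        else:
            tiers[url] = "redirect"
    return tiers
```

The percentage parameters are the knobs to turn when the top tier comes out too large for human review.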

Validation, after cutover

Scoring on the way in is only half the job. After the cutover I run the same query against the new domain at the 30-day mark and compare: did migrated pages retain their unique-query count? Did the redirected pages' former queries show up on the destination category pages? Did any of the 'review' tier pages that I kept on the fence quietly die?

On Teach on Mars, 27 of 30 migrated pages retained or grew their query coverage in 30 days. Two needed content refreshes. One was a retargeting error and got re-redirected. The scoring model doesn't replace judgment — it tells judgment where to focus.
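The 30-day retention check reduces to a set comparison per URL: which of the page's pre-cutover queries still produce impressions on the new domain? A sketch, assuming you have the before/after query sets (for redirected pages, `after` would hold the destination category page's queries rather than the dead URL's):

```python
def query_retention(before, after):
    """Compare per-URL query coverage across the cutover.

    before/after: dict url -> set of queries with >= 1 impression.
    Returns dict url -> (retained_fraction, lost_queries).
    """
    report = {}
    for url, old_queries in before.items():
        new_queries = after.get(url, set())
        retained = old_queries & new_queries
        frac = len(retained) / len(old_queries) if old_queries else 1.0
        report[url] = (frac, old_queries - new_queries)
    return report
```

Sorting the report by retained fraction ascending surfaces the pages that need a content refresh or a redirect fix first.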

Where this breaks

  • Brand-new sites. If the source site is less than 12 months old, the GSC data is too noisy. Use GA4 sessions and Ahrefs backlinks as fallbacks.
  • E-commerce faceted pages. Filter combinations dominate raw query counts because they share a base template. Deduplicate on canonical URL before scoring.
  • Query cannibalization. If two old URLs rank for the same query set, the formula treats them as independent when they're actually competing. Flag these manually.
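For the faceted-page case, the dedup is a pre-processing step on the raw rows before aggregation. A minimal sketch: fold each URL onto its declared canonical when you have one, else fall back to stripping the query string — a crude but common default for filter parameters (the `canonical_of` mapping is a hypothetical input you'd build from a crawl).

```python
from urllib.parse import urlsplit, urlunsplit

def dedupe_on_canonical(rows, canonical_of=None):
    """Rewrite each row's page to its canonical URL before scoring.

    canonical_of: optional dict mapping raw URL -> declared canonical.
    Without it, the query string and fragment are simply dropped.
    """
    def canonical(url):
        if canonical_of and url in canonical_of:
            return canonical_of[url]
        parts = urlsplit(url)
        # keep scheme/host/path, drop ?filters and #fragments
        return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

    for r in rows:
        r = dict(r)                      # don't mutate the caller's rows
        r["page"] = canonical(r["page"])
        yield r
```

Run this before the per-URL aggregation so filter combinations pool their queries onto one scored page instead of inflating the row count.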

The deliverable

The migration plan spreadsheet has seven columns: URL, score, tier, decision, redirect target, owner, notes. Every page on the old site is accounted for. Every decision is defensible. Engineering gets a concrete list instead of 'figure it out.' SEO gets a measurable baseline to validate against. If you're planning a migration of more than 200 pages, don't eyeball it. Score it.
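Rendering the deliverable is the boring part, which is the point: seven fixed columns, one row per URL, no free-form structure for engineering to misread. A sketch with the column names from this article (the row values are placeholders):

```python
import csv
import io

COLUMNS = ["url", "score", "tier", "decision", "redirect_target", "owner", "notes"]

def migration_plan_csv(pages):
    """Render the migration plan as CSV text.

    pages: iterable of dicts keyed by the seven COLUMNS above.
    """
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=COLUMNS)
    writer.writeheader()
    writer.writerows(pages)
    return buf.getvalue()
```

Writing to a string rather than straight to disk makes the function trivial to test and to pipe into whatever sheet tool the team actually uses.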

For the upstream MarTech infrastructure audit that feeds this — how I crawled 3,489 pages of Pardot forms across 5 languages — see Auditing 3,489 Pages of MarTech Infrastructure with Playwright. For the productized version of this full methodology, see the Migration Strategy package.

Tags: SEO · Migration · Enterprise · GSC