The Backlink Graveyard: A Domain Rebrand Case Study

We audited 10,000 legacy backlinks from fokal.com's old life as an interior design magazine. Here's what Google actually credits and what we filtered out.

When we took over the domain fokal.com, we knew it had a past life. What we didn’t expect was how much of that past life was still pointing at our new business.

Backlink tools counted 10,036 inbound links. Google Search Console credits 20 referring domains. The gap between those two numbers is where most of the interesting SEO decisions live, and almost nobody talks about it.

Here’s what we found when we actually looked.

The old Fokal

From 2010 to 2016, fokal.com was a curated interior design magazine. Its pages showcased single-item design objects one at a time. A chandelier made from a vintage Chanel scarf. A 19th-century London church converted into a home. A Halloween skull-shaped wine decanter. Each post was a short feature with photos and a credit back to the original source.

Design blogs around the world cited it the way you would cite Dezeen or Design Milk today. Some journalists discovered interesting objects on Fokal, then linked back. That curation pattern built an editorial link profile across a decade.

Then the original owner moved on. The site went dark. The domain eventually came to us, and we rebuilt it as an AI search visibility platform. Completely different category, completely different audience, completely different value proposition. The old content disappeared. The old URLs started returning 404s. The old backlinks did not go away.

Four tools, four different stories

Ahrefs maintains the largest commercial web crawler outside of Google itself, so we started there. We pulled SEMrush and Ubersuggest alongside it for cross-reference, then compared all three against Google Search Console’s own view. Four indexes of the same domain, to see where they agreed and where they diverged.

When you audit a legacy backlink profile, the first thing you learn is that backlink tools disagree. Wildly.

| Source | Backlinks | Referring domains |
| --- | --- | --- |
| Ubersuggest | 10,036 | — |
| SEMrush | 1,300 | 304 |
| Ahrefs (free tier) | 834 | 172 |
| Google Search Console | ~50 | 20 |
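The inflation is easy to quantify. A minimal sketch of the ratios, treating GSC's approximate count of ~50 as 50 for the arithmetic:

```python
# Backlink counts from the audit table above (GSC's ~50 taken as 50).
counts = {
    "Ubersuggest": 10_036,
    "SEMrush": 1_300,
    "Ahrefs (free tier)": 834,
    "Google Search Console": 50,
}

gsc = counts["Google Search Console"]
for tool, n in counts.items():
    # How many times the tool's count exceeds what Google itself reports.
    ratio = n / gsc
    print(f"{tool:24} {n:>7,}  ({ratio:.0f}x GSC)")
```

Ubersuggest's count works out to roughly 200x what GSC reports, which is the "more than 200 times" figure below.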

Ubersuggest reported more than 200 times the backlinks that GSC credited. This isn’t necessarily a sign that one tool is wrong. Each commercial crawler maintains its own index, trying to count every link it can find. GSC only shows what Google actually associates with your domain. The two are measuring different things.

For audit purposes, the hierarchy of authority runs:

  1. Google Search Console. The only source that represents what Google itself uses. If a link doesn’t show up here, it isn’t contributing to your rankings.
  2. SEMrush or Ahrefs. Large commercial indexes. Useful for discovering links GSC hasn’t yet indexed, and for competitive analysis against specific referring domains.
  3. Ubersuggest. Smaller index with more aggregator noise. Inflates totals but can surface long-tail links others miss.

If you are only going to run one tool, run GSC. Everything else is either a preview or a sanity check.
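Starting from GSC in practice means working from its links export rather than a tool dashboard. A minimal sketch of counting referring domains from a GSC "Top linking sites" CSV export; the sample rows and the `Site` column name are assumptions, so adjust the header to match your actual export:

```python
import csv
import io

# Hypothetical GSC "Top linking sites" export. Real exports may use
# different column names; check your downloaded CSV.
sample_export = """\
Site,Linking pages
decoracion.trendencias.com,8
example-aggregator.net,412
pinterest.com,57
"""

def referring_domains(csv_text: str) -> list[str]:
    """Return the referring domains listed in a GSC links export."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["Site"] for row in reader]

domains = referring_domains(sample_export)
print(len(domains), "referring domains credited by GSC")
```

Note that a high "Linking pages" count from an aggregator is exactly the kind of volume the commercial tools inflate on; the domain list, not the page count, is the signal.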

What Google actually credits

Here is the number that matters: Google credits fokal.com with 20 referring domains.

Not 10,000. Not 1,300. Twenty.

The 93% gap between SEMrush's 304 referring domains and the 20 that GSC credits is the clearest possible view of Google's spam and quality filtering. Most of the historical link profile consists of scraper sites, foreign aggregators that republish snippets with link-backs, Pinterest boards from 2016, defunct social bookmarking sites, and other low-value crawl targets. Google has had years to figure out that these are not real editorial signals, and has discounted them accordingly.

That filtering happens silently. Nothing in any commercial tool tells you which links are credited and which are ignored. The only way to see the truth is to check GSC directly, which is why it has to be the anchor of any audit.

The biggest editorial link source in our profile wasn’t a major publication. It was decoracion.trendencias.com, a Spanish-language interior design blog with a Domain Rating of 72.

Trendencias had credited Fokal as a source across eight different articles, from 2012 through 2017. Each link was in-body, dofollow, and used “Fokal” as the anchor text. Every time they covered a notable home or design object, they linked back to the Fokal post where they had first encountered it.

We didn’t find this cluster in Ubersuggest. Ahrefs surfaced two of the eight. SEMrush surfaced all of them. The lesson is that commercial backlink indexes sample differently. If you only check one, you will miss real editorial relationships that matter.

A serial linker is worth more than a hundred one-off mentions. When one site has credited you eight times over five years, that is a relationship, not an accident. It also means one domain becomes the single highest-leverage target for redirect planning. Fix those eight URLs and you recover a meaningful chunk of your surviving editorial signal in one pass.
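Cross-validation like this reduces to a set intersection: a referring domain that appears in GSC and in at least one commercial index is both credited by Google and inspectable page-by-page in a crawler. A sketch with hypothetical domain sets (only trendencias is a real linker from this audit); in practice you would load each set from that tool's CSV export:

```python
# Referring domains as each tool reports them -- hypothetical sets for
# illustration, loaded from CSV exports in a real audit.
ahrefs = {"decoracion.trendencias.com", "design-blog.example", "scraper-farm.example"}
semrush = {"decoracion.trendencias.com", "design-blog.example", "old-pins.example"}
gsc = {"decoracion.trendencias.com", "design-blog.example"}

# Cross-validated: Google credits the domain (it appears in GSC) and at
# least one commercial index can show you the actual linking pages.
cross_validated = gsc & (ahrefs | semrush)
print(sorted(cross_validated))
```

These cross-validated domains become the shortlist for redirect planning later in the audit.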

Government domains are the highest-trust class of backlink Google recognises. A single .gov.au link is worth more than a hundred scraper mentions. So when we got listed in the National AI Centre’s AI Directory, we expected a high-trust signal to start showing up in GSC.

It didn’t.

The listing page was live. The link pointed at the canonical www.fokal.com. The page was dofollow. It was in the site’s sitemap with a priority of 0.7. Everything looked right. But a site: search returned zero results, and GSC showed nothing.

The diagnosis turned out to be a discovery problem, not a quality problem. The directory is built on Next.js and served from Vercel. Its main /directory page uses client-side pagination, which means only the first ten organisations appear in the initial HTML payload. To find individual org pages, Googlebot has to crawl the sitemap or follow client-rendered links. On a lower-traffic government property, sitemap crawl frequency can be slow enough that new pages take weeks or months to index.
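You can test for this yourself by checking whether your listing's URL appears in the raw HTML payload, which is what Googlebot sees on its first pass before any JavaScript rendering. A minimal sketch with a hypothetical payload and paths; in a real check you would fetch the directory page with an HTTP client and pass its body in:

```python
def link_in_initial_html(html: str, target: str) -> bool:
    """True if the target URL appears in the raw HTML payload, i.e. it is
    discoverable without executing JavaScript."""
    return target in html

# Hypothetical payload from a client-side-paginated directory page: only
# the first page of listings is server-rendered; the rest arrive via JS.
initial_html = '<a href="/directory/org-1">Org 1</a> <script src="/app.js"></script>'

print(link_in_initial_html(initial_html, "/directory/org-1"))  # server-rendered
print(link_in_initial_html(initial_html, "/directory/org-42"))  # JS-only, invisible to a first-pass crawl
```

If the second check is false for your listing, the page depends on sitemap crawling or external links for discovery, which is exactly the slow path described above.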

The fix is to help Google discover the page through paths it already crawls regularly. Linking to the listing from fokal.com, from LinkedIn posts, and from new blog content gives Googlebot multiple discovery paths. Once the page is indexed, the high-trust backlink counts automatically.

If you earn a link from a smaller government or industry directory and it isn’t showing up in GSC after a few weeks, check whether the source page is even indexed. Orphaned profile pages on SPA-based directories are a surprisingly common pattern.

What we did about it

Once we had the full picture, the decisions were simpler than the audit suggested.

1. Set up path-specific 301 redirects for the 16 highest-value legacy URLs. Not a blanket redirect of every /f-* URL. The long tail of 404s can die naturally. We only redirected paths where we had cross-validated editorial links from DR 70+ sources. Each redirect was targeted.

2. Redirected those 16 paths to this post, not the homepage. This is the decision we spent the most time on. A redirect to / would have been the conventional move. The problem is that a 15-year-old interior design backlink pointing at an AI search visibility homepage is topically meaningless. Google may treat it as a soft 404, or at best give it minimal weight.

A redirect to a post that actually discusses the old Fokal, the rebrand, and the cleanup work is topically bridged. Any reader or crawler following the redirect lands somewhere that explains exactly why the content they were looking for no longer exists. The context is preserved and the inbound link equity has a real destination.

3. Added an “as featured in” section to fokal.com. This gives Googlebot multiple paths to discover our new, high-trust listings, including gov.au directories, Crunchbase, and LinkedIn. It also signals credibility to human visitors who land on the homepage.

4. Didn’t touch the disavow tool. Disavow is for manual penalties. No penalty exists. The old link profile is background noise, not a threat.
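Once redirects like these are live, it is worth verifying that each one is a single permanent hop to the intended post, since 302s and redirect chains pass weaker signals. A sketch of that check; the legacy paths, destination URL, and observed responses are all hypothetical, and in practice you would collect them with an HTTP client configured not to follow redirects:

```python
def check_redirect(status: int, location: str, expected: str) -> bool:
    """Pass only if the response is a permanent 301 pointing directly at
    the intended destination -- no chains, no temporary 302s."""
    return status == 301 and location == expected

# Hypothetical legacy paths and observed first-hop responses.
post_url = "https://www.fokal.com/blog/backlink-graveyard"
observed = {
    "/f-chanel-scarf-chandelier": (301, post_url),
    "/f-skull-decanter": (302, post_url),  # temporary: weaker signal
    "/f-church-conversion": (301, "https://www.fokal.com/"),  # wrong target
}

for path, (status, location) in observed.items():
    ok = check_redirect(status, location, post_url)
    print(f"{path}: {'OK' if ok else 'FIX'}")
```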

Three things to take away

If you are managing a rebranded domain, or any domain with meaningful history, here is what this audit would tell someone else in the same position.

Audit with GSC first, not paid tools. The 93% gap between commercial crawlers and Google's actual credit is the single most important data point you can get. It tells you which of your links are load-bearing and which are wallpaper.

Redirect intentionally, not universally. Blanket 301s from an old topic to a new homepage often cause more confusion than they resolve. Cherry-pick the editorial links that cross-validate across multiple sources, and redirect those to the closest topical destination you can write. Let the rest 404 and die.

Orphaned pages on trusted domains need discovery help. Even a perfect .gov.au backlink is worth nothing if Google never finds the source page. Link to your verified listings from your own site, from social posts, and from blog content. You can’t request indexing for URLs you don’t own, but you can make them easier to discover.

A 15-year-old domain history is not a liability you need to clean up. It is a signal you need to read. Once you know what Google is actually counting, the decisions about what to fix, what to redirect, and what to leave alone become obvious.

And if you happen to be in the AI search visibility business, the fact that your own backlink profile needs this kind of work is exactly the sort of thing you write a blog post about.
