
5 Things We've Learned From Google's New War on Links

Is Google taking a less tolerant stance on "spam"? Which link types are most often classified as "unnatural"? How much can exact match anchor text tactics hurt your site? Here are some conclusions and top takeaways based on recent real world work.

It’s been 18 months now since Google’s Penguin update launched and a similar amount of time since the first manual penalty messages were sent to unsuspecting webmasters.

That’s a long time in the world of digital marketing. While most industries deal with a level of change, the rate of iteration across the web is unprecedented.

Such a level of change requires an agile approach to processes. Google practices a Kaizen approach to product development and penalties, so it’s imperative that we consistently reexamine how and why we do everything.

The same rule applies to how penalties are dealt with. It’s a given that the tolerances Google allows across metrics have changed since those penalties were first introduced. Industry opinions would certainly support that theory.

Strangely for a content-led company, the digital marketing agency I run is now very experienced in penalty recovery, thanks to new clients coming to us in search of a different way to market their businesses.

It means, in short, that I have lots of data to draw conclusions from. I want to share our findings based on recent real-world work, including a few key tips on areas you may be missing while cleanup is going on. Here are some top takeaways.

1. Link Classification

While Google has long been giving out examples of links that violate their guidelines, in recent weeks things have changed.

Until recently it was so easy to call a “bad” link that you could spot one with your eyes closed. Classification was so easy that it spawned a proliferation of “link classifier” tools. And while they're useful as a general overview and for working at scale, the pace of Google’s iteration has made manual classification an absolute must.

So what has changed?

We’ve always known that anchor text overuse is a key metric. Here are the results of a charting study we ran across those clients escaping either manual or algorithmic penalties:

[Chart: Percent of Suspect Links Post-Recovery]

It isn’t perfect, but the data shows an irrefutable trend toward a less tolerant stance on “spam” by Google.

I don’t want this to be seen as a definitive result or scientific study, because it isn’t. It is simply some in-house data we have collated over time that gives a general picture of what’s going on. Recovery, in this instance, is classed as either a manual revoke or a “significant” improvement in rankings and traffic sustained over more than a month.
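
For illustration only (this is not how the chart above was produced), the following Python sketch shows the kind of crude tally involved: it reads a hypothetical backlink export and flags links as “suspect” on two simple signals, exact-match commercial anchors and spammy-looking referring domains. The column names, head terms, and domain patterns are all assumptions.

```python
import csv

# Hypothetical heuristics only -- a real review is manual and far more nuanced.
TARGET_TERMS = {"cheap widgets", "buy widgets online"}   # head terms the site chased
LOW_QUALITY_HINTS = ("articledirectory", "linkdirectory", "bookmark")

def is_suspect(link):
    """Flag a link on two crude signals: an exact-match commercial anchor,
    or a spammy-looking referring domain."""
    anchor = link["anchor_text"].strip().lower()
    domain = link["source_domain"].lower()
    return anchor in TARGET_TERMS or any(h in domain for h in LOW_QUALITY_HINTS)

def percent_suspect(csv_path):
    """Read a backlink export (columns assumed: source_domain, anchor_text)
    and return the share of links flagged as suspect."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        links = list(csv.DictReader(f))
    if not links:
        return 0.0
    return 100.0 * sum(is_suspect(link) for link in links) / len(links)

if __name__ == "__main__":
    print(f"Suspect links: {percent_suspect('backlinks.csv'):.1f}%")
```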

2. The Link Types Being Classified as ‘Unnatural’ Are Changing

The view that things are indeed changing has been supported by example links coming through from Google in the past four weeks as part of its manual review communication.

Instead of the usual predictable irrelevant web directory or blog network, the search giant seems to be getting much more picky.

And while I can’t share exact links due to client confidentiality, here are a couple of examples of link types that have been specifically highlighted as “unnatural”:

  • A relevant forum post from a site with good TrustFlow (Majestic’s measure of general domain “trust”).
  • A Domain Authority (DA) 27 blog with relevant and well-written content (DA is a Moz.com metric measured out of 100).

Ordinarily these links would pass most classification tests, so it was surprising to see them listed as unnatural. Clearly we can’t rule out mistakes by whoever reviewed the site in question, but let’s assume for a moment this is correct.

In the case of the forum post, it had been added by a user with several posts, and the text was relevant and part of the conversation. It looked natural.

The blog post was the same: natural by almost every metric.

The only factor that could be called into question was the anchor text. It was an exact match phrase for a head term this site had been attempting to rank for in the past. That may be an obvious signal, and it's one of the first places to look for unnatural links, but it gives an interesting nod to where Google may be taking this.

3. Co-Citation and the End of Commercial Anchors?

A lot has been written about the changing face of anchor text use and the rise of co-citation and co-occurrence. In fact, I penned a piece a few months ago on the future of link building without links. It seems as though Google now wants to accelerate this by putting more pressure on those still using exact match tactics.

It is certainly my view now that links are playing a less significant role in general rankings. Yes, a site has to have a good core of links, but Google’s algorithms are now much more complex. That means Google is looking at more and more metrics to define the search visibility of a domain, which leaves less room for “links” as a contributory factor.

Given that semantic search also isn’t reliant on links, and that Google has made clear its intention to move toward this future, it’s clear that brand mentions, social sharing, and great content produced regularly and on point are becoming more critical.

Links are by no means dead; anyone who says they are is crazy. But there is certainly more contributing to visibility now.

4. Check Your Page-Level Anchor Text

Penguin 2.0 has also changed the way we look at penalties in general. While it was once OK to simply take a domain-wide view of link metrics such as quality, anchor text, and relevance, that’s no longer enough.

The search giant has become much more targeted in its application of penalties, certainly since Penguin 2.0. As a result, we’re now seeing partial penalties being reported in Webmaster Tools, as well as full manual actions and a plethora of other actions.

This means one thing: Google understands its data better than ever and is looking at link quality in a much deeper way, examining not just the links pointing directly to your site but also where the sites linking to you get their own link juice from.

5. Look Out for Different Pages Ranking

One sure-fire sign of individual page over-optimization or penalization is Google failing to return what you would consider the “right” page for a term. This is often because Google is ignoring the “right” page and ranking other pages on your site instead.

If you see different pages ranking for a specific term over the space of a few weeks, it’s worth checking the anchor text and the links pointing specifically to the page that should be ranking.

Often you may find only a handful of links pointing to it, but if 50+ percent of them are exact match, that now seems to be enough to create issues.
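
As a rough sketch of that check, the Python below reads a hypothetical backlink export, isolates the links pointing at a single page, and reports what share of their anchors exactly match a given head term. The file name, column names, target URL, and head term are all placeholders.

```python
import csv

def exact_match_share(csv_path, target_url, head_term):
    """From a backlink export (columns assumed: target_url, anchor_text),
    return (number of links, percent of anchors exactly matching head_term)
    for the links pointing at target_url."""
    head_term = head_term.strip().lower()
    anchors = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row["target_url"].rstrip("/") == target_url.rstrip("/"):
                anchors.append(row["anchor_text"].strip().lower())
    if not anchors:
        return 0, 0.0
    exact = sum(1 for a in anchors if a == head_term)
    return len(anchors), 100.0 * exact / len(anchors)

if __name__ == "__main__":
    count, pct = exact_match_share("backlinks.csv",
                                   "https://example.com/widgets",
                                   "blue widgets")
    flag = "  <-- worth investigating" if count and pct >= 50 else ""
    print(f"{count} links, {pct:.0f}% exact match{flag}")
```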

What Now?

The key is to be informed. Invest in multiple data sources to ensure you have the full picture. You can use the following:

  • The link data in Google Webmaster Tools
  • Majestic (for TrustFlow and anchor text data)
  • Moz (for Domain Authority and other link metrics)

The above combination allows you to take a full-picture view of every link pointing to your site and gives you a second opinion should you feel it necessary. Removing links is a significant step, so it pays to have more than one view to back up initial findings on things such as anchor text use, link quality, and trust.

Alongside that, it’s worth running a check of every linked-to page on your site so you can then review anchor text ratios for each one. That way you can reduce the impact of partial actions.
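
Here's a minimal sketch of that kind of cross-check: it merges backlink exports from two hypothetical CSV files, de-duplicates on source and target URL, and prints the exact-match anchor percentage for every linked-to page. File names, column names, and the term list are assumptions; real exports from the tools above will each need their own column mapping.

```python
import csv
from collections import defaultdict

TARGET_TERMS = {"blue widgets", "cheap widgets"}   # hypothetical head terms

def load_links(csv_path):
    """Load a backlink export; columns assumed: source_url, target_url, anchor_text."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        return [(row["source_url"].strip(),
                 row["target_url"].rstrip("/"),
                 row["anchor_text"].strip().lower())
                for row in csv.DictReader(f)]

def anchor_report(csv_paths):
    """Merge several exports, de-duplicate on (source, target),
    and return the exact-match anchor percentage for each linked-to page."""
    seen, per_page = set(), defaultdict(list)
    for path in csv_paths:
        for source, target, anchor in load_links(path):
            if (source, target) not in seen:
                seen.add((source, target))
                per_page[target].append(anchor)
    return {page: 100.0 * sum(a in TARGET_TERMS for a in anchors) / len(anchors)
            for page, anchors in per_page.items()}

if __name__ == "__main__":
    report = anchor_report(["majestic_export.csv", "second_source_export.csv"])
    for page, pct in sorted(report.items(), key=lambda kv: -kv[1]):
        print(f"{pct:5.1f}%  {page}")
```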

The key is to reduce the use of exact match anchors as much as humanly possible, because the tolerated percentages are only going one way!

Above all, it may be time to start thinking beyond links entirely, toward a world of “brand as publisher”: creating great content from a clearly defined content strategy, then supporting it with an informed distribution strategy. But that’s a story for another day.
