
How Much Optimization is Too Much?

When figuring out what to do with your site, see what other players in the same space are doing. This can help you identify some boundaries for your search engine optimization efforts.

This question plagues many publishers. They know they need to compete, and it’s a dog-eat-dog world. However, they also know that the search engines provide guidelines for what they consider acceptable practices, and if you violate their guidelines they reserve the right to downgrade the rankings of your site, or even remove it from the index.

You want to be as aggressive as you can, but you don’t want to be too aggressive, so to speak. So where is the dividing line?

The answer happens to be my favorite answer to all search engine optimization (SEO) questions: It depends. Not very satisfactory perhaps, but definitely accurate.

However, there is a way to do some analysis and get a sense for what you can and can’t do. Fair warning, this post represents my speculation on the topic, and I can’t declaratively state that the speculation is accurate.

The Basics

Search engines use hundreds of signals to clue them in on how to rank a given web page. One thing a search engine does is break down segments of different markets to understand typical behavior in each market. In other words, to help it decide how to rank a site in a particular market segment (for example, sites about vitamins) the search engines look at other sites in that same segment to examine their behavior.

It’s quite possible that they use this information to decide what represents spammy behavior in a given market.

An Example

Consider an issue that used to be the subject of much discussion: keyword density. Crudely measured, you can think of it as the number of times a particular keyword phrase is used divided by the total number of phrases of the same length on a page. SEOs don’t speak much about keyword density any more because the days of massive keyword stuffing on web pages are (thankfully) largely gone.
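Just to make that crude measure concrete, here is one way you might compute it in Python. This is purely my illustration of the definition above (using overlapping word windows of the phrase's length); it is not a formula any search engine has published, and the sample page copy is made up.

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Crude keyword density: occurrences of the phrase divided by the
    total number of phrases of the same length (in words) on the page."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if n == 0 or len(words) < n:
        return 0.0
    # Every n-word window on the page (overlapping), per the crude definition above.
    windows = [words[i:i + n] for i in range(len(words) - n + 1)]
    hits = sum(1 for w in windows if w == phrase_words)
    return hits / len(windows)

# Example: density of a three-word phrase in a short (invented) snippet of page copy.
page_copy = ("Vitamin D supplements explained: why vitamin D supplements matter "
             "and how to choose vitamin D supplements.")
print(f"{keyword_density(page_copy, 'vitamin D supplements'):.3f}")
```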

But, like most things in SEO, I’d bet that there’s still a penalty for excessive keyword use. Consider a page that has a certain three-word phrase in the title and in an H1 header, uses it as the lead phrase in several other heading tags, repeats it throughout the text, and perhaps makes little use of synonyms for the phrase because the publisher is so focused on that one search term.

In all likelihood, you’re over the line at this point. So where is the line then? It depends on the behavior of the other sites in your same market segment.

Continuing our example, it most likely depends on what the most authoritative sites in the vitamin market segment are doing. A site that is rightfully thought of as an authority in the vitamin market is likely to be one that the search engines want to rank in high positions for lots of related queries.

Possible Role of Authoritative Sites

So even if a site that is considered an authority in a market pushes the limits a bit, it would be undesirable for the search engine to punish it for that behavior, because it’s still the best search result for many related queries. I’m not saying that search engines don’t punish authoritative sites that engage in spammy behavior, because it happens, but they have a strong disincentive to do so.

So if the authoritative sites tend to push the limits in certain ways, such as a much higher keyword density than is typical in other markets, is that spammy activity, or the new norm for the vitamin market space?

Certainly, there are still limits as to what they can do. Even Google Japan can be penalized, so there are limits. I believe those limits move, however, based on the behavior of authoritative sites in a market segment.

Purely algorithmic ranking systems don’t have an inherent understanding of what the norm should be for a market segment, so they rely on the significant players in that segment to help them figure that out.

When you’re trying to figure out what to do with your own site, consider what other players in the same space are doing. Bear in mind that an authoritative site is likely to get more latitude than your site (unless yours is seen as the authoritative site), so copying their behavior isn’t always a good idea, but seeing what they’re doing can give you some boundaries on the things you might consider doing with your own site. A rough sketch of that kind of comparison follows.
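As one way to put this into practice, you could measure how the top-ranking pages in your segment use a phrase and treat their range as a loose boundary for your own page. This sketch reuses the keyword_density() function from the earlier snippet; the filenames are placeholders standing in for competitor page text you’ve already saved, not real data sources, and the interpretation is my own rule of thumb rather than anything the search engines have confirmed.

```python
from statistics import median

# Placeholder files containing the visible text of a few top-ranking competitor pages.
competitor_files = ["competitor_a.txt", "competitor_b.txt", "competitor_c.txt"]
phrase = "vitamin d supplements"  # the phrase you're evaluating on your own page

densities = []
for path in competitor_files:
    with open(path, encoding="utf-8") as f:
        densities.append(keyword_density(f.read(), phrase))

# Treat the observed range as a loose boundary, not a rule: a page sitting far above
# the high end of what authoritative competitors do is probably pushing its luck.
print(f"Competitor range: {min(densities):.3f} to {max(densities):.3f}")
print(f"Competitor median: {median(densities):.3f}")
```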
