Cloaking: A Guide From Google

To Google, “cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.”
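
To make that definition concrete, here is a minimal sketch of the user-agent pattern it describes. Flask, the route and the page copy are my own illustrative assumptions, not anything published by Google:

```python
# A sketch of user-agent cloaking, the exact pattern the guidelines flag.
from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def home():
    user_agent = request.headers.get("User-Agent", "")
    if "Googlebot" in user_agent:
        # A keyword-stuffed version served only to the crawler; this is
        # the behavior that can get a site removed from the index.
        return "<h1>Cheap widgets, best widgets, buy widgets now</h1>"
    # The ordinary page human visitors see.
    return "<h1>Welcome to our widget store</h1>"
```

One URL, two different pages depending on who is asking: that is precisely the behavior the guidelines treat as deceptive.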

Unfortunately, like Flash before them, dynamic websites seem to break these rules, though not with any “evil” intent. As Ralph Tegtmeier, a.k.a. Fantomaster – a well-known proponent of effective cloaking – notes, “almost everything qualifying for state-of-the-art these days cannot be spidered or indexed efficiently!”

The purity of the algorithm is taking on iconic status, much as the “cloaking device” did when it entered the “Star Trek” vocabulary back in 1968 (the term was first used in “The Enterprise Incident”). But Google, in many cases, merely avoids the heavy lifting and either creates tagging requirements, pushes pages into supplemental listings, or removes content, expecting website owners to make whatever changes are needed to cope with the impact.

In a recently released video, Matt Cutts discussed some of the problems on both sides of the cloaking issue. His viewpoint seems to imply that Google will err on the side of purity and penalize sites not following their directions.

Statements such as “there is no such thing as white hat cloaking” and “the page the user sees is the same page the Googlebot sees” show Google really needs a better understanding of how the web is developing.

Website owners looking to use the best technology to enhance their users’ experience already have to deal with tracking and privacy issues – just like the search engines. Google knows webmasters are generally trying to give visitors good content, but the technology makes it difficult for the spiders to fully crawl and interpret this dynamically generated content.

Just as Google developed methods to crawl Flash and PDFs, it is time now for them to do some heavy lifting of their own and develop a way to better handle the problems of dynamic content. If they can read AJAX and JavaScript well enough to start pulling and indexing comments from Facebook, there must be a way to do something similar for content management system (CMS) pages.
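
Google has, in fact, met webmasters halfway once already: under its AJAX crawling scheme, the crawler can re-request a #! URL with an ?_escaped_fragment_= parameter and receive a static HTML snapshot in return. Here is a rough sketch of the server side, assuming Flask and a hypothetical snapshot store in place of a real pre-rendering step:

```python
# A sketch of the server side of the AJAX crawling scheme: the crawler
# re-requests a #! URL as ?_escaped_fragment_=<state> and gets static HTML.
from flask import Flask, request

app = Flask(__name__)

# Hypothetical pre-rendered snapshots standing in for a real rendering step.
SNAPSHOTS = {
    "page=products": "<h1>Products</h1><p>Full, crawlable product listing.</p>",
}

@app.route("/catalog")
def catalog():
    fragment = request.args.get("_escaped_fragment_")
    if fragment is not None:
        # Crawler request: serve static HTML so the dynamic content
        # can be indexed without executing any JavaScript.
        return SNAPSHOTS.get(fragment, "<h1>Unknown state</h1>")
    # Human visitors get the JavaScript shell that renders #! states client-side.
    return '<div id="root"></div><script src="/static/app.js"></script>'
```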

Cutts’ video recognized some of the issues I addressed last year regarding geotargeting offers based on location. Adapting pages based on search queries, referrer, or cookie data can be manipulated for good or ill, but on the whole it is done for profit. What Google needs to realize is that a site that is good at conversions must be doing something right; otherwise, users wouldn’t convert.
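
For context, the gray area in question usually looks something like the sketch below: the indexed copy stays identical for everyone, and only a promotional block adapts to referrer and cookie data. Flask and every cookie, referrer and offer name here are hypothetical:

```python
# A sketch of referrer/cookie-based adaptation: the core content never
# changes; only the promotional banner does.
from flask import Flask, request

app = Flask(__name__)

def pick_offer(req):
    # A returning visitor, identified by a (hypothetical) cookie.
    if req.cookies.get("returning_visitor") == "1":
        return "Welcome back: 10% off your next order."
    # A first-time visitor arriving from a search engine referrer.
    if "google." in req.headers.get("Referer", ""):
        return "First visit? Free shipping on your first order."
    return "Check out this week's specials."

@app.route("/product")
def product():
    banner = pick_offer(request)
    # The indexed product copy below is the same for users and crawlers.
    return f"<div class='offer'>{banner}</div><h1>Acme Widget</h1><p>Specs.</p>"
```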

Saying these methods are “black hat” really is erroneous. The vast majority of sites using them are just trying to simplify the visitor’s path to what they want. True, there is a profit motive, but associating that with “evil” or “black hat” is inaccurate.

Surely with Webmaster Tools they could create a platform that recognizes the elements of a dynamic site and designates the areas that may change based on various criteria. Or perhaps a content tag could be established to mark the sacrosanct text: text that wouldn’t change and could be firmly associated with the page.

As Cutts outlined at the launch of Webmaster Tools back in August 2006, WMT lets sites discover crawl errors, set the preferred www or non-www version of a domain, and even warns of spam penalties. That was six years ago; the web has come a long way since then, and Google should be able to keep up.

Google wants to show people the best possible results in the search index, or so they say. A recent Reuters article notes, however, that Google’s attempts at personalization may be keeping searchers from getting to what they really need to find.

“Google tries very hard to please you by finding you more stuff just like the other stuff you clicked on last time. That is the essence of Google’s great cleverness. But that very brilliance is becoming more and more damaging to the shared view out to an objective fact-based world,” Reuters noted.

If Google uses prior behavior to change the search results they present us, how does this not use the same techniques as cloaking? “We are now all living in what we believe to be the objective, self-evidently google-able truth. And we are not.”

The problem is that the general public sees Google as an impartial entity, while marketers see it more as an adversary. Once upon a time Google was not as adversarial, but whether spammers caused the change or government scrutiny shifted the relationship, Google is now seen as an opponent by most marketers.

Google really needs to move back into a supportive role for SEOs. The changes in how websites generate their pages, and the ability to have this content indexed, should be part of all search engines’ agendas. It is in everyone’s best interests.

But the system now favors Google: they are the engine, and as such everyone comes to them. The recent Senate hearings show people are becoming more aware of the power Google has. Google needs to join the effort to address the growing use of dynamic elements. People want these new sites that let them find what they want in less time.

We can redirect for browser languages, we can create mobile versions, and the engines can recognize these as non-cloaking. It really is time to go further.
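
A language redirect of that kind can be as simple as the following sketch (Flask, the paths and the language list are assumptions); the Vary header declares that the response differs by language, not by who is asking:

```python
# A sketch of adaptation the engines already treat as legitimate:
# redirect by the browser's Accept-Language header and declare it.
from flask import Flask, request, redirect

app = Flask(__name__)

@app.route("/")
def root():
    # Pick the closest supported language from the Accept-Language header.
    lang = request.accept_languages.best_match(["en", "de", "fr"], default="en")
    response = redirect(f"/{lang}/")
    # Tell caches and crawlers the answer varies by language, not by visitor.
    response.headers["Vary"] = "Accept-Language"
    return response
```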

The very nature of dynamic sites and their compartmentalized page areas should work well with the search crawlers. When the canonical tag was first launched, Joost de Valk had some lead time to develop plugins for integrating the new tag into WordPress, Drupal and Magento.

I’m sure the people at the various CMS companies would be happy to work with Google on this. Unfortunately there is no API for the organic results, but perhaps WMT could work with developers to create a methodology.
