Google PSA: News Publishers Can Use Robots.txt to Block Us

Google is once again reminding news publishers that they do not have to be indexed by the search engine. All they have to do is add a couple of lines to their robots.txt file to block Googlebot.
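For anyone unfamiliar, the standard robots.txt directive for blocking Google's crawler is genuinely this short. Placing the following in a robots.txt file at the site root tells Googlebot not to crawl any page on the site:

```
# Block Google's crawler from the entire site
User-agent: Googlebot
Disallow: /
```

A publisher who only wanted to block Google News, rather than web search entirely, could target the `Googlebot-News` user agent instead.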

News publishers, for whatever reason, can’t seem to understand that Google doesn’t host their content. Perhaps they think that the web is a system of interpipes that are built high in the clouds where they’re burning holes through the ozone layer or something. Sigh.

The truth is that news publishers want to charge for access to their sites, just like they charged for print editions. So they want Google to pay to index their sites. If they were truly concerned about Googlebot, they would simply block it. But they know how much traffic Google sends to their sites. They’re just playing dumb.

Remember when newspapers did charge for access to online content? And that didn’t work out? So they offered it free with ads? Because they don’t own the news and it’s going to spread around the interpipes no matter what they charge?

So here’s my PSA to news publishers: The web has largely been built on the path of least resistance. And thou dost protest too much.

Related reading

How to lead SEO teams and track their performance effectively: Expert tips
SEO is a team sport: How brands and agencies organize work
How to pitch to top online publishers: 10 exclusive survey insights
Search reports for ecommerce to pull now for Q4 planning