Updates to Google Webmaster Tools Make Sites More Crawlable

While JavaScript, CSS, and linked images make websites look good and function properly, they can cause SEO headaches if those resources are blocked from crawling. Now, Google is aiming to remedy that problem by making sure webmasters know exactly which website features are being blocked.
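Blocked resources of this kind usually trace back to a site's robots.txt file. The sketch below is a hypothetical example (the directory names are illustrative, not from the source) of rules that hide script and stylesheet directories from all crawlers, followed by rules that explicitly allow Googlebot to fetch them:

```
# Hypothetical robots.txt: these Disallow rules block CSS and JavaScript
# from all crawlers, which can prevent Googlebot from rendering the page.
User-agent: *
Disallow: /assets/js/
Disallow: /assets/css/

# One common fix: explicitly allow Googlebot to fetch those resources.
User-agent: Googlebot
Allow: /assets/js/
Allow: /assets/css/
```

More specific rules (the Googlebot group above) take precedence for that crawler, so the page's scripts and styles become crawlable without opening the directories to every bot.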

In a blog post this morning, Google said its new reporting feature will begin with the names of blocked hosts. Clicking on the “rows” column will diagnose the problem in more detail with a list of blocked resources and a step-by-step guide to remedying the issues.

Google is also making it easier to test sites for crawling problems with Fetch and Render, a URL retrieval feature that gives webmasters side-by-side screenshots of how a page appears to Googlebot and to a typical visitor's browser.

Greater transparency into Googlebot crawling issues affects a number of areas, including whether pages earn “Mobile-Friendly” tags in search results.
