Google Webmaster Tools Adds 90 Days of Search Query Data, Drops 3 Features

Google has announced that you can now see up to 90 days of historical data in your Webmaster Tools Search Queries reports, up from the previous 35 days. Three features – subscriber stats, the Create robots.txt tool, and Site Performance – are being removed.

More Search Data

[Screenshot: 90 days of search query data in Webmaster Tools]

As Google explains:

In order to see 90 days, the option to view with changes will be disabled. If you want to see the changes with respect to the previous time period, the limit remains 30 days. Changes are disabled by default but you can switch them on and off with the button between the graph and the table. Top search queries data is normally available within 2 or 3 days.

In addition to the expanded historical data, you can now view search query data as soon as you verify ownership of a site. You can also see data for the top 2,000 queries from which your site gets clicks.

If you see fewer than 2,000 queries, it means your site hasn’t received that many clicks, or those clicks aren’t being tracked. It could also mean that your data is spread out among different countries. For example, a search query for “flowers” in Google Canada is counted separately from a query for “flowers” in the U.S. Nevertheless, with this change, 98 percent of sites on the web will have complete coverage.

Fewer Features

Now the bad news: Google also announced that it is removing three features from Webmaster Tools to free up resources for newer projects.

  • The Subscriber Stats feature, which reports the number of subscribers to a site’s RSS or Atom feeds, will no longer be accessible via Webmaster Tools. FeedBurner continues to provide this information.
  • The Create robots.txt tool, which generated a robots.txt file to block specific parts of your site from being crawled by Googlebot, is being dropped due to low usage. You will now have to create the file with a text editor or another web-based tool.
  • The Site Performance feature, which showed the average load time of your site’s pages, is also being dropped due to low usage. You can still see this data in Google Analytics using the Site Speed reports.

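For anyone who relied on the retired generator, a robots.txt file is plain text and easy to write by hand. A minimal sketch, using a hypothetical /private/ directory as the blocked path, looks like this:

```
# Block Googlebot from crawling the /private/ directory (example path)
User-agent: Googlebot
Disallow: /private/

# Allow all other crawlers full access
User-agent: *
Disallow:
```

Save the file as robots.txt and serve it from the root of your domain (e.g. example.com/robots.txt) so crawlers can find it.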
These three features will all be removed within the next two weeks.
