Ever wonder how much of your traffic comes from actual human visitors and how much from scrapers, bots, and other web spiders? Google Analytics will now help you discern just that.
On its official Google+ page, the Google Analytics team has announced a new setting to help site owners separate "real" traffic from that of bots and other spiders.
By checking a box in your view settings, Google Analytics will filter out all traffic from known bots. The list of known bots is pulled from the IAB/ABC International Spiders & Bots List, which the Interactive Advertising Bureau has maintained since 2006. Access to the list typically costs $14,000, though IAB members receive a discount.
How to Filter Bots in Google Analytics
To filter bot and spider traffic from Google Analytics, go to your Admin settings. Under the View panel, you'll find View Settings. Toward the bottom of the options, just before Site Search Settings, you'll find a small heading for Bot Filtering with a checkbox that reads: Exclude all hits from known bots and spiders. Check it, and you'll automatically filter known bots and spiders out of your Analytics data.
Because of how this is implemented (and the cost of the list of bots), you won't know which bots are actually hitting your site, nor will you be able to filter specific bots. But by opting to filter them out, you'll see a significant difference in your traffic patterns that just might help you better understand your website visitors.
Some estimates suggest that, on average, over one-third of all web traffic to a given site comes from automated bots. While the filter list is updated regularly, it can't possibly cover every bot, so some bot traffic will still reach your reports.
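If you want a rough, do-it-yourself sense of how much automated traffic your own server sees, one common approach is to check the User-Agent header of each hit against known crawler tokens. The IAB/ABC list itself is proprietary, so the handful of tokens below is purely illustrative, not the real list, and a sketch like this will only catch bots that announce themselves honestly:

```python
# Illustrative sketch: flag hits whose User-Agent contains a well-known
# crawler token. The token list here is a small hypothetical sample,
# NOT the proprietary IAB/ABC Spiders & Bots List.

KNOWN_BOT_TOKENS = (
    "googlebot",    # Google's crawler
    "bingbot",      # Microsoft Bing's crawler
    "slurp",        # Yahoo's crawler
    "duckduckbot",  # DuckDuckGo's crawler
    "baiduspider",  # Baidu's crawler
    "yandexbot",    # Yandex's crawler
)

def is_known_bot(user_agent: str) -> bool:
    """Return True if the User-Agent contains a known crawler token."""
    ua = user_agent.lower()
    return any(token in ua for token in KNOWN_BOT_TOKENS)

# Example: filter a list of User-Agent strings from hypothetical log hits.
hits = [
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9) AppleWebKit/537.36",
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)",
]
human_hits = [ua for ua in hits if not is_known_bot(ua)]
```

In this sample, two of the three hits would be classified as bots, leaving one "human" hit. Bots that spoof a browser User-Agent will slip through, which is one reason a maintained list like the IAB's is valuable.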
If you enable the option to filter bots, you will very likely see a drop, possibly a significant one, in reported traffic, but your numbers will be a more accurate measure of your real visitors.