Google Analytics’ New Bot Filter Helps Clean Up Data
By Yevgeniy Levich
Data integrity is key to using Google Analytics to track the health of your site and draw useful conclusions from traffic data. Google Analytics’ new bot filter makes it simpler to see only legitimate site traffic.
This filter automatically blocks all hits coming from known bots and spiders on the Interactive Advertising Bureau’s (IAB) fairly extensive list of page crawlers.
Unfiltered bot and spider hits can cause serious data integrity issues, depending on the popularity of your site. Each bot crawl recorded as a hit, or visit, artificially inflates your total traffic numbers and bounce rates. These artificial, non-consumer visits can also drag down average transaction volumes, time on page, and other useful metrics.
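To see how much distortion this can cause, here is a minimal sketch with made-up numbers (the session and bounce counts below are hypothetical, not from any real property); crawlers typically register as single-page visits, so every bot session counts as a bounce:

```python
# Hypothetical illustration of how unfiltered bot hits skew metrics.
# All counts below are invented for the example.

human_sessions = 8_000
human_bounces = 3_200   # human visitors who left after one page
bot_sessions = 2_000    # crawlers usually hit one page and leave,
bot_bounces = 2_000     # so each bot session registers as a bounce

true_bounce_rate = human_bounces / human_sessions
inflated_bounce_rate = (human_bounces + bot_bounces) / (human_sessions + bot_sessions)

print(f"Reported sessions: {human_sessions + bot_sessions} (actual: {human_sessions})")
print(f"Bounce rate: {inflated_bounce_rate:.0%} reported vs {true_bounce_rate:.0%} actual")
```

In this toy scenario, 2,000 bot sessions push the reported bounce rate from 40% to 52%, a twelve-point error with no change in real visitor behavior.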
Previously, you could block these bots only through manual profile filters targeting each of their User Agents. The new filter is a more efficient and holistic approach, sparing you from having to track down every known User Agent and giving you a form of access to the IAB list without paying for its fairly pricey membership.
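For context, the old manual approach amounted to screening each hit’s User-Agent string against a hand-maintained blocklist. The sketch below illustrates that idea in Python; the tiny `KNOWN_BOT_AGENTS` tuple is an illustrative sample, not the IAB list or any Google Analytics API:

```python
# Sketch of the manual approach: match each hit's User-Agent against a
# hand-maintained list of bot substrings. This short list is a sample
# for illustration only, not the IAB's extensive list.

KNOWN_BOT_AGENTS = ("googlebot", "bingbot", "ahrefsbot", "semrushbot")

def is_bot(user_agent: str) -> bool:
    """Return True if the User-Agent matches a known crawler substring."""
    ua = user_agent.lower()
    return any(bot in ua for bot in KNOWN_BOT_AGENTS)

hits = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0",
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
]
human_hits = [h for h in hits if not is_bot(h)]
```

The obvious weakness, and the reason the new checkbox is an improvement, is that the list goes stale the moment a new crawler appears, and maintaining it is entirely on you.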
As with most Analytics filters, however, this will not affect your data retroactively; it will only be in effect from the day you tick that check box.