How to Keep Spambots From Ruining Your Traffic Reporting
A couple of weeks ago, we ran a post warning that spambots may be inflating your traffic reporting in Google Analytics. Now let’s look at a couple of ways to prevent that.
Google’s Bot Filter
About a year ago, Google added a filter for just this sort of thing, and it’s quite simple to turn on. With administrative rights, open Google Analytics and click the Admin link.
Once there, click on View Settings.
Down near the bottom of the screen you’ll find the Bot Filter checkbox. Just check that and you’re done.
Sort of. Google is only filtering out traffic from known bots and spiders. There may be plenty of spambots that aren’t on that list, so some of their hits may still show up in your reports. But at least it will filter out friendly spiders like Googlebot.
If That’s Not Enough
If you’re still seeing lots of spambot traffic from the sources we illustrated in our previous post, it may be time to pull out the big guns. The simplest way to keep these bots from crawling your site is to stop them at the door. Not only will they stop polluting your traffic reporting, they won’t even get onto your site, saving your web server some load.
If your web host is running a Unix or Linux server, you can block these spambots in the .htaccess file. (Don’t worry if you don’t know what that is; your webmaster will know.) The folks over at ROI Marketing have done some nice work on this, compiling an extensive list of the most common spambots, and they even went so far as to provide the code you can copy & paste into your .htaccess file.
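To give you a sense of what that code looks like, here’s a minimal sketch of the approach (illustrative only, not ROI Marketing’s actual list). It uses Apache’s mod_rewrite to return a 403 Forbidden to any request whose referrer matches a spam domain; the two domains shown are placeholders, so substitute the spam referrers you’re actually seeing in your reports.

    # Block referrer spam at the server, before it ever reaches your pages.
    # The domains below are placeholders; swap in the spam referrers
    # showing up in your own Google Analytics reports.
    <IfModule mod_rewrite.c>
      RewriteEngine On
      # Match the Referer header case-insensitively; [OR] chains conditions.
      RewriteCond %{HTTP_REFERER} spamdomain-one\.com [NC,OR]
      RewriteCond %{HTTP_REFERER} spamdomain-two\.com [NC]
      # Any request that matches gets a 403 Forbidden.
      RewriteRule .* - [F]
    </IfModule>

Because the block happens at the web server, those hits never trigger your Google Analytics tracking code at all, which is why they vanish from your reports.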
It May Take a Little Ongoing Vigilance
It may be a good idea to check for new spambots every once in a while; these things tend to pop up without warning. But utilizing these two techniques should take a really big bite out of your referrer spam traffic.
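If you have shell access and your host keeps standard Apache logs, one quick way to spot newcomers is to pull the most frequent referrers straight out of the access log. This one-liner assumes the combined log format and a typical log path; adjust both for your server:

    # List the 20 most frequent referrers in the access log.
    # (Field 4, when splitting on double quotes, is the Referer header.)
    awk -F'"' '{print $4}' /var/log/apache2/access.log | sort | uniq -c | sort -rn | head -20

Anything suspicious near the top of that list is a candidate for your .htaccess block list.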