Blocking bad bots

Search engine optimization includes being able to measure your efforts. Google Analytics is a great way to do this (and something I set up for every one of my clients as part of their website package). It lets us see how people are finding the site, what happens when they get there, and so much more. A great tool. However, it has recently come to my attention that there are some unwanted visitors of late: bots!

Some bots are useful. The Google bot, for example, is one we want to encourage. Others, though, simply strain resources, falsely inflate visitor numbers, and drastically increase your ‘bounce rate’ (a metric Google Analytics provides showing the percentage of visitors who navigate away from the site after viewing only one page). These bots show up, register as a visitor, then take off right away. Not ideal by any means.

We simply don’t want these bad bots in our stats skewing the data. One way to block them is directly in Google Analytics. While that should deflect the problem, it isn’t as effective as I’d like. So I take blocking the bad bots even further: I don’t let them reach the site at all (freeing up resources for legitimate visitors), directly through .htaccess. Bye bye, bots!
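For the curious, here is a minimal sketch of what referrer-based blocking in .htaccess can look like, assuming an Apache server with mod_rewrite enabled. The domains shown are hypothetical placeholders; you would substitute the actual spam referrers showing up in your analytics reports.

```apache
# Sketch only: block requests whose referrer matches known spam domains.
# badbot-example.com and spam-example.net are placeholders, not real spammers.
<IfModule mod_rewrite.c>
RewriteEngine On
# [NC] = case-insensitive match; [OR] chains conditions together
RewriteCond %{HTTP_REFERER} badbot-example\.com [NC,OR]
RewriteCond %{HTTP_REFERER} spam-example\.net [NC]
# [F] returns 403 Forbidden, so the bot never loads the page
RewriteRule .* - [F,L]
</IfModule>
```

Because the request is refused before any page is served, the bot never triggers the Google Analytics tracking code, so it never appears in your stats at all.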

I’ve set this up for all of my current clients as a complimentary service, because I want every one of them to get the best service possible. Ask your webmaster about blocking the bad bots…or better yet, become one of my clients and don’t worry about it.

You can read more about this from WordPress here: Deny Access Referrer Spammers