Fed up with the Stats…

That's it! I just checked the stats of my Blogspirit account and found that 354MB of traffic is way too much. I wondered why, since I know my blog isn't read by that many people. The statistics list revealed yet another sad truth: robots crawl my site as if a stampede were racing across the server. Googlebot and Slurp alone generate more than 200MB, and robots account for roughly 78% of the traffic in total!

This is way too much, and I'd like to ask for some support here. I already set the robots META tag to "index,nofollow" and revisit-after to "7 days", but a robots.txt file to specifically exclude aggressive spiders, or the ability to edit the .htaccess and ban certain IP addresses, would really help (see the sketches below)… Philippe, Thomas? That's how I made my 300th blog entry, and I'd like to continue still… ;)
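For the record, here's roughly the robots.txt I have in mind. It's only a sketch, and it only helps against crawlers that actually honor the file: Crawl-delay is respected by Slurp but ignored by Googlebot, and "SomeAggressiveBot" is just a stand-in name for whatever else shows up in the stats:

    # robots.txt, which must sit at the site root, e.g. http://myblog.example/robots.txt
    # Slow down Yahoo!'s Slurp: wait 20 seconds between fetches
    # (Slurp honors Crawl-delay; Googlebot ignores it)
    User-agent: Slurp
    Crawl-delay: 20

    # Shut out a badly behaved crawler completely ("SomeAggressiveBot" is a placeholder)
    User-agent: SomeAggressiveBot
    Disallow: /

    # Everyone else may crawl everything
    User-agent: *
    Disallow: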
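And if Philippe and Thomas ever unlock the .htaccess, banning offenders would look something like this. I'm assuming the server runs Apache and permits these overrides (AllowOverride Limit FileInfo), which I can't verify from here; the IP addresses are placeholders from the documentation ranges, not real offenders:

    # .htaccess: flag requests whose User-Agent matches a misbehaving bot (mod_setenvif)
    SetEnvIf User-Agent "SomeAggressiveBot" bad_bot

    # Allow everyone, then deny the flagged bots and the listed addresses (mod_access)
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
    Deny from 192.0.2.17
    Deny from 203.0.113.

The nice part of the .htaccess route is that it works even against robots that ignore robots.txt entirely: the server simply answers 403 Forbidden instead of serving megabytes of pages.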