For anyone interested, the recent bad-robot blocking seems to have had a positive outcome! Last week we were getting around 1.5 million hits a day from badly behaved robots in the various calendar apps around the site. (That's an average of 17 hits per second!)
This week it's down to 50,000 or so per day, and yesterday only about 10,000.
The main benefit here is that it reduces server load, thus making the site faster for real users. It also means the web stats will be a bit more believable!
Note that this is just the bad robots doing this - the well-behaved robots (e.g., from Google and Bing) follow the robots.txt rules and use the sitemap.xml files, so they generate only a moderate amount of traffic. The bad robots, on the other hand, were just going crazy, constantly following every link they could find even though they were excluded in the robots.txt configuration.
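For anyone curious what those exclusions look like, here's a sketch of the kind of robots.txt rules involved (the user-agent names and paths below are purely illustrative, not the site's actual configuration):

```
# Illustrative example only - not this site's actual robots.txt

# Ask a specific misbehaving crawler to stay away entirely
User-agent: BadBot
Disallow: /

# Ask all crawlers to skip the calendar pages
User-agent: *
Disallow: /calendar/
```

Well-behaved crawlers like Googlebot read this file and honor the Disallow rules. The bad robots simply ignore it, which is why blocking them on the server side was necessary.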
Cheers
Joe