I was wondering how people are able to create AJAX applications like star-rating scripts without having bots hit the scripts. I have a star-rating script on my site that was getting hit by bots. After I added rel="nofollow" to the links, that cut down on the well-behaved bots, but now I've started getting hit by some bots again.

I have had a good amount of experience dealing with bots in the past, with them submitting forms that were open to all visitors. After I added a "What is 1+1?" question as a check on those forms (a sketch of it is below), the problem was solved. But I like my AJAX scripts to conform to the AJAX philosophy of keeping everything as quick and simple as possible, so having a confirmation step on a star-rating script would be kind of weird. When you look at sites like bash.org with their + and - voting, Craigslist with their flagging, or the tons of sites that now use AJAX star-rating scripts, I wonder how they keep bots from contaminating the input. The only thing I can think of is that they ban bots entirely, but that seems like it would be hard.
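For reference, the form check I used was roughly the following; this is a minimal sketch, and the field name human_check is just a placeholder:

    <?php
    // Minimal sketch of the "What is 1+1?" form check; field name is a placeholder.
    if ($_SERVER['REQUEST_METHOD'] == 'POST') {
        // Reject the submission unless the visitor answered the question correctly.
        if (!isset($_POST['human_check']) || trim($_POST['human_check']) != '2') {
            header('HTTP/1.1 403 Forbidden');
            exit('Bot check failed.');
        }
        // ...process the legitimate submission here...
    }
    ?>
    <form method="post" action="">
        <label>What is 1+1? <input type="text" name="human_check"></label>
        <input type="submit" value="Submit">
    </form>

It's crude, but bots filling out open forms blindly never got past it.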

I should mention that I know about the various ways to deal with bots (examples below the list):
1. Site-wide: IP banning & robots.txt
2. Page-specific: meta tags
3. Link-specific: rel="nofollow"
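To be concrete, these are the mechanisms I mean; the bot name and script path here are just illustrative:

    # robots.txt (site-wide): keep a particular bot away from the rating script
    User-agent: BadBot
    Disallow: /rate.php

    <!-- page-specific meta tag -->
    <meta name="robots" content="noindex, nofollow">

    <!-- link-specific attribute on the rating link -->
    <a href="rate.php?id=123&stars=5" rel="nofollow">Rate 5 stars</a>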

But the problems are:
1. Site-wide: It would be a pain to constantly ban every non-good bot. Or am I wrong about this? (A sketch of what I mean is below the list.)
2. Page-specific: For a site with a couple hundred static PHP pages, it would be hard to manage creating and editing meta tags.
3. Link-specific: This doesn't keep bad bots, and sometimes even good bots, from following the link. (Some people say the good bots still hit the link and just don't pass PageRank to it, but that doesn't really matter here.)
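For problem 1, the kind of site-wide banning I mean would look something like this, and the lists have to be updated constantly, which is why it seems like a losing battle (the IPs and user-agent strings are made up):

    <?php
    // Sketch of site-wide banning; the IPs and user-agent strings are made up.
    $banned_ips    = array('203.0.113.7', '198.51.100.23'); // needs constant updating
    $banned_agents = array('BadBot', 'SomeScraper');        // same problem here

    $ip = $_SERVER['REMOTE_ADDR'];
    $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

    if (in_array($ip, $banned_ips)) {
        header('HTTP/1.1 403 Forbidden');
        exit;
    }
    foreach ($banned_agents as $agent) {
        // Bots can trivially change their user-agent, so this check is fragile too.
        if (stripos($ua, $agent) !== false) {
            header('HTTP/1.1 403 Forbidden');
            exit;
        }
    }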