I have a relatively large site with a lot of information that I want to better protect from being scraped by bots and content grabbers.

How have others handled this?

My guess is that a simple IP address table that records each visitor's last visit time, plus some logic comparing that timestamp to the current time, would be the easiest way to start preventing it, but I wanted to hear some opinions on how others have fought the good fight.
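
To be concrete, something like this rough sketch is the kind of per-IP check I have in mind (Python, with placeholder window/limit values and no persistence, just to illustrate the idea):

```python
import time

# Placeholder values; real limits would depend on legitimate traffic patterns.
WINDOW_SECONDS = 10
MAX_REQUESTS = 20

_visits = {}  # ip -> list of recent request timestamps (in-memory only)

def is_allowed(ip: str) -> bool:
    """Return True if this IP is still under the request limit for the window."""
    now = time.time()
    # Keep only timestamps inside the sliding window, then record this request.
    recent = [t for t in _visits.get(ip, []) if now - t < WINDOW_SECONDS]
    recent.append(now)
    _visits[ip] = recent
    return len(recent) <= MAX_REQUESTS

# Usage at the top of a request handler (hypothetical request_ip variable):
# if not is_allowed(request_ip):
#     return 429  # Too Many Requests
```

Obviously this wouldn't catch scrapers rotating through many IPs, which is partly why I'm asking what has actually worked for others.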

Thanks.