Crawl-delay directive in robots.txt - use it or not?

Hi,

I just learned about the crawl-delay directive in the robots.txt file, and I have a question about its proper use.

I have a website that is updated once or twice a week, yet the traffic stats show that Googlebot crawled the site 4,000+ times last month (more than 100 visits a day), and all spiders combined made 6,000+ crawl requests. My site clearly doesn't need to be crawled that often, and although the load is not a big deal right now, the spiders do consume some bandwidth.

I was considering setting the crawl delay to about 1 hour so that the spiders visit my site less often (roughly along the lines of the snippet below). Do you think this is a good idea?
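
For reference, this is what I had in mind; as far as I understand, the crawlers that honor Crawl-delay interpret the value as the number of seconds to wait between requests, so one hour would be 3600:

User-agent: *
# ask crawlers that honor Crawl-delay to wait 3600 seconds (1 hour) between requests
Crawl-delay: 3600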