Help! Stop an individual folder from being crawled and indexed

Hi guys,

Not sure whether this is the right section to post this question, but I was wondering whether you could help with stopping Google's spiders etc. from crawling and indexing a particular folder on my site.
I searched around and found this method.

User-agent: *
Disallow: /foldername/

Is this the best way to do this? Do I just upload the file to the root folder?

Legitimate spiders generally look for and obey the contents of the robots.txt file in the root of the site. Bad bots may also read the file, but they won't obey its instructions.
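
One thing to keep in mind: Disallow stops well-behaved crawlers from fetching the folder, but Google can still list a disallowed URL in its index (without content) if other sites link to it. If you need the folder kept out of the index entirely, you can send a noindex header instead. A rough sketch, assuming Apache with mod_headers enabled, placed in an .htaccess file inside /foldername/ (the folder name is just your example):

<IfModule mod_headers.c>
    # Tell compliant crawlers not to index, or follow links from,
    # anything served out of this folder.
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>

Note that a crawler has to be able to fetch a page to see that header, so you wouldn't combine this with a robots.txt Disallow for the same folder.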

Thanks felgall!

So, basically, this is the best I can do: just minimise the amount of crawling and indexing?

You could also consider spanking those naughty bots by creating a black hole for them, as described here:

:slight_smile:
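
The basic idea is simple: robots.txt disallows a trap URL, and the link to it is hidden from human visitors, so any client that requests that URL has ignored robots.txt and can be logged and banned. A very rough sketch of the trap script, assuming Apache with CGI enabled (all paths and names here are hypothetical):

#!/usr/bin/env python3
# blackhole.py (hypothetical name): CGI trap for bots that ignore robots.txt.
#
# Companion robots.txt entry (the trap path is made up for this example):
#   User-agent: *
#   Disallow: /blackhole/
#
# A link to /blackhole/ is hidden from human visitors (e.g. via CSS),
# so anything that requests it has ignored robots.txt.
import os
from datetime import datetime, timezone

BLACKLIST = "/var/www/data/bad-bots.txt"  # assumed writable by the web server

# Apache exposes the client address and user agent as CGI environment variables.
ip = os.environ.get("REMOTE_ADDR", "unknown")
agent = os.environ.get("HTTP_USER_AGENT", "unknown")

# Append the offender to a log you can feed into your server's deny list.
with open(BLACKLIST, "a") as f:
    f.write(f"{datetime.now(timezone.utc).isoformat()} {ip} {agent}\n")

# Send a 403 and a terse body via the standard CGI response format.
print("Status: 403 Forbidden")
print("Content-Type: text/plain")
print()
print("Go away.")

You'd then feed the logged IPs into your server's deny rules, which is roughly what the black-hole approach automates.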

Thanks so much, ralph.m!