I am listing my site with a search engine, but I want to keep a few HTML documents private, i.e. I do not want them to show up in the search results even if someone types the related keywords. How do I do that?
Thanks in advance.
There's no way to absolutely guarantee that a page won't be indexed, but the following link explains a method that most search engines will respect:
Helping Small Business Grow Online!
That'll teach you how to make robots.txt files, but they aren't a reliable way to block search engine spiders — the file is only a request, and not every spider honors it.
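For reference, a minimal robots.txt sits at the root of your site and asks spiders to skip a directory. The directory name here is just an example — use whatever folder holds your private pages:

```
# robots.txt — must live at http://www.yoursite.com/robots.txt
User-agent: *
Disallow: /private/
```

`User-agent: *` means the rule applies to all spiders; each `Disallow` line names a path they're asked not to crawl.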
Many, many large sites publish robots.txt files with broad Disallow rules, including IBM: http://www.ibm.com/robots.txt and Sun: http://www.sun.com/robots.txt . It looks almost as if Sun is trying to hide its subscriber-only downloads (I may be wrong) by disallowing those paths.
The bottom line is, if a robot can read the file and learn which URLs not to index (i.e. your "secret" files), then anyone can read it and visit them too.
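One alternative that avoids advertising the paths in a public file is a robots meta tag inside each page you want kept out. It's just as advisory as robots.txt, but at least the URL isn't listed in one convenient place for snoops:

```html
<!-- goes in the <head> of each page you want excluded -->
<meta name="robots" content="noindex, nofollow">
```

`noindex` asks engines not to list the page; `nofollow` asks them not to follow its links.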
You could place the relevant docs in a new directory and password protect them, right?
You certainly could. Shalini3010, what level of protection do you need? From the search engines only, or from all visitors? If it's from all visitors, go to http://www.realmz.net/showarticle.ph...b6940e5c&aid=1 (on RAGE's site... he's a member of these forums)
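If your site runs on Apache (an assumption — check with your host), the usual sketch of password-protecting a directory is an .htaccess file like this; the AuthUserFile path and realm name are examples:

```
# .htaccess placed inside the directory you want protected
AuthType Basic
AuthName "Members Only"
AuthUserFile /home/yoursite/.htpasswd
Require valid-user
```

You'd then create the password file with Apache's htpasswd tool, e.g. `htpasswd -c /home/yoursite/.htpasswd username`. This blocks spiders and casual visitors alike, since the server refuses the page without a valid login.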