First of all, I'm assuming this is search engine related; if not, I can post it in a more appropriate forum. If it is search engine related, how do you write one? Any help would be appreciated.
When a search engine spider visits a site, it first checks whether there is a robots.txt file. This file can be configured to exclude spiders from different parts of the site.
There is lots of information about how robots.txt works at http://info.webcrawler.com/mak/proje...ts/robots.html or try AltaVista's FAQ at http://doc.altavista.com/adv_search/..._avoiding.html
The default is to index everything, so it doesn't matter if you don't have this file unless you want to exclude spiders.
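As a rough sketch of what such a file looks like (the directory names here are made up for illustration): each record names a user-agent and lists the paths that agent may not fetch. An empty `Disallow:` excludes nothing, which matches the default behavior of having no robots.txt at all.

```
# Allow all spiders everywhere (same effect as having no robots.txt)
User-agent: *
Disallow:

# Alternatively, to keep all spiders out of a hypothetical /cgi-bin/ directory:
# User-agent: *
# Disallow: /cgi-bin/
```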
Thank you! Those two links should answer all my questions.
Does anyone know a good website that can check a robots.txt file? I found one, but it did not work well: it reported that something was wrong when nothing actually was.
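If a web-based checker is giving suspect results, one alternative is to test the file yourself with Python's standard-library robots.txt parser. This is a minimal sketch; the rules and URLs below are made up for illustration, and in practice you would point `set_url()` at your real robots.txt instead of feeding lines in directly.

```python
# Check robots.txt rules locally using Python's standard library.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# parse() accepts the file's lines directly; for a live site you would use
# rp.set_url("http://example.com/robots.txt") followed by rp.read().
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# can_fetch(useragent, url) returns True if that agent may fetch the URL.
print(rp.can_fetch("*", "http://example.com/index.html"))   # True
print(rp.can_fetch("*", "http://example.com/private/x"))    # False
```

This checks the same allow/deny logic a spider applies, so you can verify a rule without depending on a third-party validator.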