How do I prevent my website and all of its pages from being indexed by Google, Yahoo, the Wayback Machine, and so on?
One article I found recommends this...
If you do not want your page indexed, and would also like to prevent it from showing up in Google's search results, then the meta robots tag is the most effective way to achieve both goals. Simply include the following meta robots tag within the head of your web page:

<meta name="robots" content="noindex">

And do not block the URL in robots.txt; let the spiders crawl the page in question. Crawling is what allows them to see the noindex directive, so the URL will neither be indexed nor shown in their search results (SERPs).
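For context, here is a minimal sketch of where that tag would sit in an XHTML page head (the DOCTYPE and title are placeholders, not taken from the article). Note the tag must be self-closing in XHTML:

```html
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
  <title>Example Page</title>
  <!-- Tells compliant crawlers not to index this page.
       The tag must appear in the head, before any body output. -->
  <meta name="robots" content="noindex" />
</head>
<body>
  <p>Page content here.</p>
</body>
</html>
```

Also note that the quotation marks must be plain ASCII quotes; the curly quotes that often appear when code is pasted into an article will cause the tag to be ignored.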
Currently, I have the following robots.txt file in my Web Root...
BTW, while technically all of my web pages are PHP files (i.e., they have a ".php" extension), in reality they contain as much XHTML as PHP, or more, which is why I posted here.