Limit robots to one page only

Is there a way to limit robots to indexing only one page on my site?
I have an inventory listing service, and I only need them to index the full inventory list I provide on my page.

I don’t need any other pages on my site indexed for any reason.

It looks like this will work, but I need someone with more experience at this to tell me:


User-agent: *
Allow: /SearchGBY.php
Disallow: /

Yes, what you have suggested will work.

It’s worth noting that the “Allow” directive is not part of the original robots exclusion standard, but it is supported by the major search engines. A crawler that only implements the original standard will ignore the Allow line and treat the whole site, including /SearchGBY.php, as disallowed.
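For the engines that do honour it, the most specific (longest) matching rule generally wins, so line order doesn’t matter. A minimal sketch, assuming Google’s documented longest-match behaviour:

User-agent: *
Disallow: /
Allow: /SearchGBY.php

Here Allow: /SearchGBY.php is a more specific match than Disallow: /, so /SearchGBY.php stays crawlable even though the Disallow line comes first.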

It’s also worth noting that robots.txt manages crawling rather than indexing. If a page that is blocked to crawlers picks up enough backlinks, it can still appear in the search engine’s index (typically as a bare URL, since the content was never fetched). To prevent this, you’d be better off using the page-specific robots meta tag with “noindex” on the pages you want kept out. The catch is that a crawler can only see the noindex if it is allowed to fetch the page, so those pages must not also be blocked in robots.txt.
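For example, on every page other than SearchGBY.php you could add the following to the head of the document (a minimal sketch; adapt it to however your pages are generated):

<head>
  <!-- Tells compliant crawlers not to include this page in their index -->
  <meta name="robots" content="noindex">
</head>

The equivalent X-Robots-Tag: noindex HTTP response header does the same job and can be easier to apply in bulk at the server level.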