A bit of a nightmare here - I submitted my site to the search engines without realising that the spider would follow all the links on the page. I'm on free webspace and don't get server logs, so keeping an eye on my visitors is proving tricky; they're coming in from everywhere (as I found out after sticking a tracker on each page).
What I want to do now is remove every page on my site, bar the index page, from the search engines. What's the best way to go about this?
I really don't see why you wouldn't want all your pages indexed by a search engine, but you can control this by creating a robots.txt file. You can find instructions on how to do this here:
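For reference, a robots.txt file is just a plain text file placed in the root of your site. A minimal example (the `/private/` path here is purely illustrative):

```
# Ask all crawlers not to fetch anything under /private/
User-agent: *
Disallow: /private/
```

`User-agent: *` means the rule applies to every crawler, and each `Disallow` line names a path prefix the crawler is asked to avoid. Note it's only a request - well-behaved spiders honour it, but it isn't enforced by the server.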
If I create a robots.txt file excluding every file on my site bar the main index page, and resubmit my URL, will the pages already indexed be removed once the search engines find that robots.txt no longer lets them access those pages?
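For what it's worth, a sketch of a robots.txt that blocks everything except the index page might look like this - with the caveat that `Allow` wasn't part of the original robots.txt standard, though the major search engine crawlers do support it, and the exact filename of your index page is an assumption here:

```
# Allow the index page, disallow everything else
User-agent: *
Allow: /index.html
Disallow: /
```

Because `Disallow: /` matches every path, the more specific `Allow` line is needed to carve out the exception for crawlers that understand it.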