Bonjour from medieval York...
Following a refurb of my site, I've got 30-plus pages I do not want Google to index. Now I know there are two ways to block the bots, but I'm not sure which will be best.
Method 1 would be to put this in the <head> of the unwanted pages: <meta name="robots" content="noindex">
Method 2 would be a robots.txt file, which terrifies me.
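For what it's worth, a robots.txt file is less scary than it sounds. A minimal sketch, assuming the unwanted pages all sit under a hypothetical /old-site/ directory (the path is just an example, not my actual structure):

```
# robots.txt — placed at the root of the domain, e.g. example.com/robots.txt
User-agent: *
Disallow: /old-site/
```

One caveat I've read about: robots.txt only blocks crawling, not indexing, so a page can still appear in results if other sites link to it. The noindex meta tag is the surer way to keep pages out of the index.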
But what about deleting the pages, would that be a problem? These pages would have no SEO benefit to me and they only link to now-defunct parts of the site.
So could I hit the self-destruct button and delete the pages I don't want...
Answers on a postcard please ;-)