I'm wondering about the best way to de-index URLs without causing major problems. A year after starting, I feel I may have over-optimized many URLs and I want to change that, e.g.
Should I just add a robots.txt rule telling crawlers not to crawl the root of /wedding-photographers//wedding-photographers/East-Sussex/wedding-photographer-East-Sussex-adam-bronkhorst-26518/wedding-photographers-index.html?
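For illustration, the kind of rule I have in mind would look something like this (the path here is just a placeholder, not my actual structure):

```
# Hypothetical robots.txt sketch - block crawling under one directory
User-agent: *
Disallow: /wedding-photographers/
```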
Will this slowly remove things from the index? I can't face adding every URL individually in the removal tool in WT…