I handle SEO for some services. I want to exclude certain addresses from the sitemap, but I can't: some of them are treated as important even when I try to exclude them.
I also tried filtering with a "not in" check on the URLs, but that was in vain too, because the excluded patterns appear as substrings of other URLs, and I end up unable to retrieve pages I actually want.
I'm not sure I fully understand, but have you tried using robots.txt to exclude certain paths from being crawled?
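For example, something like this (the paths below are just placeholders for whatever you want excluded; note that robots.txt blocks crawling, which is not quite the same thing as removing a URL from a sitemap):

```text
# robots.txt — placeholder paths, adjust to your site
User-agent: *
Disallow: /private/
Disallow: /tmp/
```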
If it's not an indexing issue, could you create your sitemap manually and include only the pages you want?
Or am I missing something?
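A hand-written sitemap is just an XML file listing the URLs you choose, so unwanted addresses simply never appear in it. A minimal sketch (example.com is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/page-you-want/</loc>
  </url>
  <url>
    <loc>https://www.example.com/another-page/</loc>
  </url>
</urlset>
```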
What script are you using to generate the sitemap? Some scripts have a config file where you specify rules for what to include or exclude.
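If your generator only supports raw substring matching, that would explain the problem you describe: excluding "/print" would also drop "/printers". One workaround, sketched below as a hypothetical post-processing filter (the names and URLs are made up, not from any particular sitemap tool), is to match on path prefixes instead of substrings:

```python
from urllib.parse import urlparse

# Hypothetical exclusion rules: match on path prefixes rather than raw
# substrings, so excluding "/private/" does not also drop "/privateer-history/".
EXCLUDE_PREFIXES = ("/private/", "/tmp/")

def keep(url):
    """Return True if the URL should stay in the sitemap."""
    path = urlparse(url).path
    return not any(path == p.rstrip("/") or path.startswith(p)
                   for p in EXCLUDE_PREFIXES)

urls = [
    "https://example.com/products/",
    "https://example.com/private/report",
    "https://example.com/privateer-history/",  # contains "private" as a substring, but kept
]
print([u for u in urls if keep(u)])
```

You could run your generated sitemap's URL list through a filter like this before writing the final file.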