How to remove/block a virtually created subdomain from search engines

Hello Friends,

I need some information: how can I remove/block a virtually created subdomain from being indexed by search engines?

e.g. xyz.example.com (the virtually created subdomain)
example.com (my target website)

My target website is example.com. When I checked its indexing status in Google, I found that a subdomain, xyz.example.com, had been indexed under my domain name. I don't have any domain/folder/pages related to xyz.example.com on my server, but that subdomain still got indexed in Google.

Can anyone in this group help me solve this problem? How can I remove/block that subdomain from search engines?

Thanks in Advance.

As far as I'm aware, the only ways to do this are to add <meta name="robots" content="noindex"> or <meta name="robots" content="noindex, nofollow"> to every page on your subdomain, or to password-protect those directories/files.
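Something like this in the head of each page, for what it's worth (just a sketch; the title and body here are placeholders):

<!DOCTYPE html>
<html>
<head>
  <title>Page on the subdomain</title>
  <!-- tell crawlers not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
<body>
  ...
</body>
</html>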

There is no way that I know of to exclude the subdomain using a robots.txt file. In any case, Google recommends using one of the above methods to be certain of preventing indexing of particular files or directories.

I think the best way would be to redirect it through your .htaccess file and to add a disallow rule to the robots.txt file.

[quote=“mikemiller7, post:3, topic:266771”]
I think the best way will be to redirect that through your .htaccess file
[/quote]
Redirect what to where? I don't see how a redirect is supposed to help here.

What rule do you suggest adding which will block that sub-domain? I’m not aware of one which will work in this situation. In any case, as I’ve already said, Google makes it clear that you should not rely on robots.txt to prevent indexing of a page or pages.

Hi Techno Bear,

Thanks for the reply. Here is my issue: I don't have any files/pages/folders related to the subdomain on my server. If I had the pages, I could add the meta noindex tag to them, but all those pages are virtually created.

Everything that is indexed from that subdomain should be redirected through your .htaccess file with a 301; otherwise you will have 404s.
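Something along these lines in the .htaccess would do it, assuming Apache with mod_rewrite and that xyz.example.com is served by the same vhost as example.com:

RewriteEngine On
# If the request came in on the unwanted subdomain, 301 it to the main site
RewriteCond %{HTTP_HOST} ^xyz\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]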

But redirecting those pages will prevent anybody accessing them, not just search engines.

You said there is nothing there and that you don't need those pages, but now you are telling me that you don't want to prevent anybody from accessing them? Please clarify what you really want.

Please read the thread carefully, @mikemiller7.

@sandyharper54 has explained the situation:

So the pages are there, available to read and hence being indexed by search engines. I’m not sure what he means by “virtually created”, but if it were simply a case of removing them altogether, I have no doubt he could do that.

What he wants is a solution which will prevent search engines indexing them, but still permit others to access the pages as required.

(Note that I am not the OP.)

Yes, and I have given him the solution: disallow them for Google through robots.txt.

But as I have explained, (a) that does not guarantee the pages will not be indexed and (b) there is no method in robots.txt which will block a sub-domain like this.

If he creates a robots.txt file and places it in the root of the subdomain, he can then add the following code to tell the bots to stay away from the entire subdomain's content:

User-agent: *
Disallow: /
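Since the subdomain has no document root of its own, one way to serve that file only on that host would be a rewrite rule in the shared .htaccess (just a sketch, assuming Apache with mod_rewrite; robots-xyz.txt is a placeholder name for a file containing the two lines above):

RewriteEngine On
# Serve the blocking robots file only when the request arrives on the subdomain
RewriteCond %{HTTP_HOST} ^xyz\.example\.com$ [NC]
RewriteRule ^robots\.txt$ /robots-xyz.txt [L]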

I'm 100% sure that with that in place it has no chance of staying indexed.

And I repeat that that is not what Google says.

That would seem to leave the OP exactly where they are now, with the possibility of pages being indexed if Google finds a way to reach them.

I have used it many times and so far I have had no pages indexed with that method, but OK, I can agree with you and your quote that the better way is to use a noindex robots meta tag. :slight_smile:
