  1. #1
    Non-Member (London)

    How to block indexing of multiple websites hosted on the same SSL network?

    Hello all,

    I have a query. We have an SSL network where we host our clients' websites for testing over SSL, but all of these URLs are being indexed by Google. We can't block this indexing for a single website, because that would block indexing for all of the sites hosted on this SSL network.

    How can we do that? Please suggest. Below is the URL we are talking about; please have a look.

    URL:
    HTML Code:
    https://www.google.co.uk/search?q=site%3Asecurenretail.co.uk&oq=site%3Asecurenretail.co.uk&aqs=chrome..69i57j69i58.1765j0j4&sourceid=chrome&espv=210&es_sm=93&ie=UTF-8

  2. #2
    TechnoBear (Argyll, Scotland)
    I'm not sure I understand what you're asking. The link you've posted is to Google search results.

    I'm assuming that you have a specific directory or directories which you're using for testing and which you don't want indexed. In that case you can block them with a robots.txt file, whilst allowing access elsewhere.
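
    For example, a minimal robots.txt in the root of the shared host could look something like this (the directory names below are only placeholders; substitute your actual test paths):

    Code:
    # robots.txt at the root of the shared SSL host
    # The test directories below are hypothetical examples
    User-agent: *
    Disallow: /client-test1/
    Disallow: /client-test2/
    # Anything not listed above remains crawlable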

  3. #3
    Non-Member (London)
    @TechnoBear If we had a separate robots.txt for each testing directory, we wouldn't need to ask; we could easily block indexing from there. But we have a single SSL network for testing, and if we block one site from indexing via robots.txt, it will block all of the sites hosted on that SSL network.



  4. #4
    TechnoBear (Argyll, Scotland)
    Sorry, I'm still not sure I understand you. As far as I can tell from the results in the link you provided, you have various directories/subdomains which you are testing. If you place a robots.txt file in your root directory, blocking just these test directories, that won't affect your main domain.

    Alternatively, you can add <meta name="robots" content="noindex, nofollow"> to all your test pages. (And remember to remove it when they go live.)
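
    For example, a test page's head section would carry the tag like this (the surrounding markup is just a placeholder sketch, not your actual pages):

    HTML Code:
    <!-- Hypothetical test page: remove the robots meta tag before going live -->
    <head>
        <meta charset="utf-8">
        <meta name="robots" content="noindex, nofollow">
        <title>Client site - test environment</title>
    </head>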

  5. #5
    Non-Member (London)

    OK @TechnoBear, we will implement this and get back to you once everything is working. Thanks for the help.



