How to block Google from indexing multiple websites hosted on SSL?

Hello all,

I have a query. We have an SSL network that we use for testing our clients' websites over SSL, but all of these URLs are being indexed by Google. We can't block the indexing for just a single website, because that would block indexing for all the sites hosted on this SSL network.

How can we do that? Please suggest. Below is the URL we are talking about; please have a look.

URL:

https://www.google.co.uk/search?q=site%3Asecurenretail.co.uk&oq=site%3Asecurenretail.co.uk&aqs=chrome..69i57j69i58.1765j0j4&sourceid=chrome&espv=210&es_sm=93&ie=UTF-8

I’m not sure I understand what you’re asking. The link you’ve posted is to Google search results.

I’m assuming that you have a specific directory or directories which you’re using for testing and which you don’t want indexed. In that case you can block them with a robots.txt file, whilst allowing access elsewhere.

@technobar If we had a separate robots.txt for each testing directory, then there would be nothing to worry about; we could easily block each one from there. But we have a single SSL network for testing, and if we block one site from indexing via robots.txt, it will block all the sites hosted on that SSL network.

Sorry, I’m still not sure I understand you. As far as I can tell from the results in the link you provided, you have various directories/subdomains which you are testing. If you place a robots.txt file in your root directory, blocking just these test directories, that won’t affect your main domain.
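
As a rough sketch of what that could look like, assuming the test sites sit in directories such as /client-a/ and /client-b/ under the root of the SSL domain (those directory names are just placeholders, not taken from your actual site):

```
# robots.txt at the root of the shared SSL test domain.
# Block crawling of the individual test directories only;
# everything else on the domain stays crawlable.
User-agent: *
Disallow: /client-a/
Disallow: /client-b/
```

Bear in mind that robots.txt only stops crawling; URLs that Google has already indexed may continue to show up in results for a while after you add the rules.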

Alternatively, you can add <meta name="robots" content="noindex, nofollow"> to all your test pages. (And remember to remove it when they go live. ;))
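
For example (the surrounding page structure here is just an illustration), each test page would carry the tag in its <head>:

```
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <!-- Tell crawlers not to index this test page or follow its links.
       Remove this line when the site goes live. -->
  <meta name="robots" content="noindex, nofollow">
  <title>Client site (test copy)</title>
</head>
<body>
  <!-- page content -->
</body>
</html>
```

One thing to watch: for the noindex tag to be seen, the page must not also be blocked in robots.txt, since a page that can't be crawled never has its meta tags read.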

OK @technobar, we will implement this and get back to you if all is OK. Thanks for the help. :stuck_out_tongue: