Sitepoint Members,
Search engines treat the HTTP and HTTPS versions of your site as two different websites, which creates duplicate content and lowers your search engine ranking. I found that my site appears to Google as having duplicate content on HTTPS pages. Why, I don't know, because I don't have SSL installed on my account.

The most commonly written-about way to deal with this is to serve a different robots.txt for HTTPS:
http://blog.leonardchallis.com/seo/s...txt-for-https/
http://www.seoworkers.com/seo-articl...and-https.html
http://www.seosandwitch.com/2012/08/...hat-to-do.html
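From what I understand of those articles, the idea is an Apache rewrite rule that hands HTTPS requests a separate robots file. A minimal sketch for .htaccess, assuming a second file named robots_https.txt (my made-up name) that disallows everything:

```apache
RewriteEngine On
# If the request came in over HTTPS, serve robots_https.txt
# (a hypothetical file containing "User-agent: *" / "Disallow: /")
# instead of the normal robots.txt.
RewriteCond %{HTTPS} on
RewriteRule ^robots\.txt$ robots_https.txt [L]
```

This needs mod_rewrite enabled, and the exact condition can vary by host, so it's only a sketch of the approach those posts describe.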

Another site said to use canonical links pointing at every preferred page:
http://www.creare.co.uk/http-vs-https-duplicate-content
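If I follow that advice, each page would emit a link tag pointing at its preferred (HTTP) URL. A minimal sketch, where canonical_tag() and 'http://www.example.com' are my own hypothetical helper and base URL, not anything from that article:

```php
<?php
// Hypothetical helper: build a canonical link tag from a preferred
// base URL and the current request path.
function canonical_tag($base, $requestUri) {
    $href = htmlspecialchars($base . $requestUri, ENT_QUOTES);
    return '<link rel="canonical" href="' . $href . '" />';
}

// Would go in the <head> of every page, e.g.:
echo canonical_tag('http://www.example.com', $_SERVER['REQUEST_URI'] ?? '/') . "\n";
?>
```

That way the HTTPS copy of a page declares the HTTP copy as the canonical one, which (as I read it) is what tells Google which version to index.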

and the same site also gave this PHP code:
<?php
// Tell search engines not to index HTTPS copies of pages.
if (isset($_SERVER['HTTPS']) && strtolower($_SERVER['HTTPS']) == 'on') {
    echo '<meta name="robots" content="noindex,follow" />' . "\n";
}
?>

I guess it goes in the <head>, just as the noindex/nofollow meta tags do.

Is there anything I should worry about with this code?

Thanks,

Chris