Sitepoint Members,
Search engines view the HTTP and HTTPS versions of your site as two different websites, which creates duplicate content and lowers your search engine ranking. I found that my site appears to Google as having duplicate content in the form of HTTPS pages. Why, I don't know, because I don't have SSL installed on my account.

The most commonly recommended way to deal with this is to serve a different robots.txt for HTTPS requests.
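For anyone else reading this thread, here is a minimal sketch of that approach on Apache with mod_rewrite, assuming a second file named robots_ssl.txt (a hypothetical name) that contains `User-agent: *` / `Disallow: /` to block all crawling of the HTTPS copy:

```apache
# .htaccess sketch: when the request came in over HTTPS,
# silently serve robots_ssl.txt instead of the normal robots.txt.
RewriteEngine On
RewriteCond %{HTTPS} =on
RewriteRule ^robots\.txt$ robots_ssl.txt [L]
```

The regular robots.txt for HTTP visitors stays untouched, so only the HTTPS duplicates get blocked.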

Another site said to use canonical links on every preferred page.
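That would look something like this — a minimal sketch where `canonical_tag()` is a hypothetical helper, not anything from that site; it always points search engines at the HTTP version of the page no matter which scheme the visitor used:

```php
<?php
// Hypothetical helper: build a canonical <link> tag that always points
// at the preferred (HTTP) URL for the given host and path.
function canonical_tag(string $host, string $path): string
{
    return '<link rel="canonical" href="http://' . $host . $path . '" />';
}

// Usage inside <head>:
// echo canonical_tag($_SERVER['HTTP_HOST'], $_SERVER['REQUEST_URI']);
```

With that in the head of every page, Google should fold the HTTPS duplicates into the HTTP originals instead of treating them as separate content.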

And the same site also gave this PHP code (it was missing its closing brace, which I've added):

if (isset($_SERVER['HTTPS']) && strtolower($_SERVER['HTTPS']) == 'on') {
    echo '<meta name="robots" content="noindex,follow" />' . "\n";
}
I guess it goes in the head, just like the noindex/nofollow meta tags do.

Is there anything I should worry about with this code?