Website Not Indexed After SSL Certificate Was Issued

But a robots.txt file doesn’t specify the protocol.

Good call. I’d updated the sitemap itself, but the sitemap’s URL in robots.txt also needed updating.
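
For anyone hitting the same thing, the change in robots.txt was roughly this (the sitemap filename here is just an example; yours may differ):

```
# robots.txt, before: still pointing at the old http URL
Sitemap: http://www.drumdr.com/sitemap.xml

# robots.txt, after: pointing at the https URL the site now serves
Sitemap: https://www.drumdr.com/sitemap.xml
```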

Actually, as of two days ago my site is now being crawled and finally indexed. I’ve also updated all the URLs in the HTML and looked for any discrepancies in the style sheet.

The https://www version is still not being accessed. I’ve tried adding that property to Search Console, but I’m having trouble verifying it.
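
For reference, one of the verification methods Search Console offers is an HTML meta tag in the homepage’s head; the token below is just a placeholder for whatever Search Console generates for you:

```html
<!-- Verification tag goes in the <head> of the homepage; the content value is a placeholder -->
<meta name="google-site-verification" content="PLACEHOLDER-TOKEN-FROM-SEARCH-CONSOLE" />
```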

This almost certainly has something to do with my problem, and the lag makes it really hard to know which of my efforts is having any effect.

I’m going to look into whether my reverse proxy has anything to do with this.
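
If the proxy does turn out to be the culprit, my understanding is the fix would be along these lines. This is only a sketch that assumes nginx sits in front of the site, which may not match my actual setup:

```nginx
# Sketch only: have the proxy send every plain-http request to the https://www version
server {
    listen 80;
    server_name drumdr.com www.drumdr.com;
    return 301 https://www.drumdr.com$request_uri;
}
```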

It’s getting better. I’m really busy with life, work, etc., and can only spend brief periods on this. Once I’ve figured this out (or as much as I think I’m going to), I’ll post an outline of all the steps I’ve taken. I imagine lots of folks are having similar issues.

I think I’ve got things working as desired now. My website and all its pages are indexed. All URL versions are directed to what I’ve designated as the canonical version.

Traffic is back to normal, and I actually expect a slight boost in the near future.

I wish I could give a clear and definitive explanation of what happened, but the truth is I have only a vague understanding of all this. Still, I think it might be useful to some if I give a rough outline of the steps that led to a resolution.

When my hosting service issued the SSL certificate, I think they failed to anticipate all the corresponding adjustments that needed to be made. At best, they failed to alert me about them.

Something that should have been obvious to me immediately, without anyone telling me, is that all references to the http: version of the URL in my sitemap, robots.txt, and HTML files needed to be updated to https:. These changes didn’t get my site re-indexed, but they needed to be done.
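
Concretely, in the sitemap that meant changing every loc entry from http to https, roughly like this (one entry shown; the real file has one per page):

```xml
<!-- before -->
<url><loc>http://www.drumdr.com/</loc></url>

<!-- after -->
<url><loc>https://www.drumdr.com/</loc></url>
```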

I added https://drumdr.com/ to my Search Console properties and requested indexing for the pages. In retrospect, I should have added the www version instead, since the old http: URL used the www, and all references in my sitemap, robots.txt, and HTML files use the www version. It took days, but I do believe it was this step that got my site re-indexed.

When JB pointed out that my www URL returned errors, I looked into everything from .htaccess to my reverse proxy trying to figure out what was happening. I tried adding the www version to my Search Console properties but could not get it verified by any method. This just added to my confusion.
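
For anyone digging through the same places: the kind of .htaccess rule that forces everything onto a single https://www URL looks roughly like this. It’s a generic Apache mod_rewrite sketch, not necessarily what my host needed, and behind a reverse proxy the HTTPS variable may not behave as expected:

```apache
# Sketch: send http and non-www requests to https://www.drumdr.com with a permanent redirect
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.drumdr.com/$1 [R=301,L]
```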

By this time all my pages were indexed, most of the backlinks were acknowledged, and traffic seemed back to normal. Throughout all this, the lag in Google’s response made it hard to know what effect any of my efforts were having.

Eventually, a Help Tech figured out that I needed to update the DNS record at the server, specifically the CNAME. This made everything fall into place (Boy I hope so!).
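
I can’t reproduce the exact record, but as I understand it the change was something along these lines, in zone-file terms; the target shown is a placeholder, since I don’t know precisely what the tech pointed it at:

```
; Sketch only: the record target is a placeholder
www.drumdr.com.    3600    IN    CNAME    drumdr.com.
```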

I’ve added https://www.drumdr.com to my Search Console properties and designated it as the canonical version, hoping this’ll avoid getting taxed for duplicate content.
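
Besides the Search Console setting, each page can also declare the canonical URL itself with a link tag in its head. A minimal example, using the homepage (the href would be each page’s own canonical URL):

```html
<!-- Tells crawlers which URL variant is the canonical one for this page -->
<link rel="canonical" href="https://www.drumdr.com/" />
```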

Wish I could explain everything clearly and in detail, but I’ve already admitted to my tenuous grasp.

Thank you to each and every one of you for your feedback and suggestions. My site would be a total flop without you.
