Dropped in the Rankings? Diagnose, Minimize & Reverse the Damage

This article is part of an SEO series from WooRank. Thank you for supporting the partners who make SitePoint possible.

It’s one of the worst feelings a website owner can experience: opening up your analytics dashboard one morning and staring at a graph of search rankings in freefall. While your first instinct might be to panic, especially if you’ve just put a bunch of work into optimizing your website, try to stay calm. Sudden, drastic position losses are not the end of the world. In fact, if you take a methodical, organized approach to figuring out what went wrong, and how to fix it, you can make sure your lost rankings are temporary and completely reversible.

Follow our step-by-step guide to troubleshoot what could be causing your site to lose ranking so you can fix the issue that’s reducing your organic search traffic.

Determine the Extent of the Problem

When you first see that your traffic has dropped, head to your analytics and check each traffic channel: direct, referral, organic search and paid (if you’re running paid campaigns). If all of your traffic is dropping, it could be a technical problem with your site. If just your search traffic has fallen off, that’s a pretty good sign that you’ve got an SEO problem.

When trying to diagnose the reason behind your lost rankings, first determine the scope of the issue: is it limited to a few individual keywords, or is your site as a whole losing rankings? The problem could also be confined to a group of keywords or to the pages within a particular category.

Those with a WooRank Advanced Review of their sites can use the SERP Checker to check their keyword rankings. If every keyword lost ranking at the same time, that points to a problem with your site as a whole, or to a site-wide penalty. If only certain keywords lost position, you probably have a problem with those particular landing pages or keyword categories.

[Screenshot: WooRank’s SERP Checker tracking keyword positions]

For example, if you’re a garden supply store, a drop in traffic could be caused by a site-wide issue, by a loss of rankings for your “BLACK+DECKER cordless electric mower” keyword, or by losing ranking for every keyword in your lawn mower category.

Sadly, not everyone has a WooRank Advanced Review, and therefore can’t take advantage of the SERP Checker. If that’s the case, you can use Google Search Console to check how your keywords rank. Head to the Search Analytics report, make sure you’re looking at data for queries (this should be the default view) and check the box for Position, since clicks alone won’t tell the whole story.

[Screenshot: query positions in Google Search Console’s Search Analytics report]

Look at positions for individual keywords to spot sudden, drastic ranking drops of 10 or more positions. See if you can spot any patterns or trends among the keywords that lost ranking. If every keyword has seen a huge drop, there’s a good chance you’ve got a site-wide issue on your hands.

View the position of your landing pages as well. Look for a pattern around what sort of pages are losing position and traffic. It could be that a particular type of content is losing its ranking power, or maybe a feature you’ve rolled out (such as Flash content or a widget that contains outbound links) to certain pages is hurting your SEO.

Diagnosing the Issue

So, after you’ve dug into your analytics data and determined the scope of your ranking drop, it’s time to start diagnosing exactly what caused the lost positions. There are going to be a lot of reports and tools thrown at you from here on out, but don’t get intimidated: it’s not as complicated as it looks.

Check Site Status and Robots.txt

Of course, if Google can’t access your site, it can’t rank it. So make sure your site is up and running: if you haven’t already, visit it now, and use a variety of devices to confirm you can view your desktop, mobile and tablet versions. Check the Crawl Errors report in Google Search Console to find out if there’s an issue affecting only some of your internal pages.
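If you want to script that availability check, here’s a minimal sketch in Python using the requests library; the example.com URL and the user-agent strings are placeholders for your own domain and the devices you care about:

import requests

# Hypothetical user-agent strings standing in for the devices you want to test.
USER_AGENTS = {
    "desktop": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "mobile": "Mozilla/5.0 (iPhone; CPU iPhone OS 12_0 like Mac OS X)",
}

for device, user_agent in USER_AGENTS.items():
    # A 200 status code means the page loaded for that user agent.
    response = requests.get("https://www.example.com/",
                            headers={"User-Agent": user_agent}, timeout=10)
    print(device, response.status_code)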

One common issue that blocks Google from crawling your site is a mistake in your robots.txt file; check the guide to robots.txt if you need help troubleshooting errors there. First, make sure you aren’t inadvertently blocking crawlers from your whole site, which would look like this:

User-agent: *
Disallow: /

This is one of the most common issues with robots.txt files, especially if you’ve recently migrated your site: staging environments are often blocked off entirely, and that file can get carried over to the live site.
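For comparison, a robots.txt that lets crawlers reach everything while blocking only a private area would look something like this (the /admin/ path is purely illustrative):

User-agent: *
Disallow: /admin/

An empty Disallow: line, or no Disallow rule at all, allows crawlers everywhere.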

You can use the Fetch as Google feature in Google Search Console to make sure there isn’t a problem preventing Google from accessing your pages, and the robots.txt Tester to make sure your robots.txt file is written correctly.

If the robots.txt file is correctly written and submitted to Google, verify that you haven’t accidentally disallowed any pages you want to appear in search results. You could go through the file line by line, but that will take forever if you have a lot of disallowed pages. Instead, use the site: search operator: do a Google search for site:yourdomain.com and Google will return the pages it has in its index. Make sure your meta descriptions appear as they should; disallowed pages often still show up in the results, but with a message noting that a description for the result is not available. Your most important pages should appear at the top of these results, so if they’re missing, something is going on.

Audit Your Site

There’s always the possibility that your site isn’t as optimized for search engines as you think, or you may have inadvertently hurt your SEO with a change you recently pushed live. So the first step in figuring out why you lost ranking is to do an SEO audit of your site or the affected pages. The WooRank audit analyzes more than 70 SEO factors, so you don’t have to go digging through your page code and content to spot those hard-to-find issues.

[Screenshot: a WooRank SEO audit]

Is your score much lower than you thought it would be? Look for important factors that are missing, like title tags, meta descriptions or <h1> tags, and double-check that your site doesn’t have common SEO problems like broken links, content in Flash or iframes, or a missing WWW resolve.
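If your server runs Apache, a WWW resolve is usually implemented as a 301 redirect in the .htaccess file. Here’s a sketch, with example.com standing in for your own domain, that sends the bare domain to the www version:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

The same resolve can run in the opposite direction; what matters is picking one version of your domain and redirecting the other to it.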

Find a good on-page SEO checklist and go through each item to make sure you’ve correctly implemented important factors like canonical URLs, robots meta tags and URL rewrites. Problems with these can result in duplicate content or in pages failing to get crawled.
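For reference, canonical URLs and robots meta tags live in a page’s <head> and look like the snippet below (the example.com URL is a placeholder). A stray noindex value in the robots meta tag is a common reason pages quietly drop out of search results:

<link rel="canonical" href="https://www.example.com/lawn-mowers/">
<meta name="robots" content="index, follow">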

If your pages are currently well optimized for search engine traffic, evaluate your page content’s quality and relevance to the keyword. Now, you might already think it’s high quality and very relevant (you did publish it after all), but your visitor metrics might be telling a different story:

  • Low click-through rate (CTR) – This is a good sign that your page content, or the messaging you use to describe your page, doesn’t match what searchers are trying to find. For our garden supply store example, people searching for “BLACK+DECKER cordless electric mower” may have been trying to find information about gas vs. electric mowers, or search behavior may have shifted to expect a guide to evaluating and purchasing an electric mower.
  • High bounce rate – Bounce rate is another good measure of the quality of your content and user experience. If a high percentage of your search traffic leaves without interacting with the page, or spends very little time on the site, your content probably isn’t meeting their needs.

If you’ve recently moved content around or gone through a site migration or redesign, make sure you’ve set up your redirects correctly. Both 301 (permanent) and 302 (temporary) redirects pass full link juice, so it’s absolutely vital that you redirect from the old URL to the new one. Use a tool like Screaming Frog to crawl your site and find all your redirects: upload a list of the URLs for pages you’ve moved and, once it’s done crawling, sort the results by status code to make sure each URL returns the proper code.

[Screenshot: a Screaming Frog crawl sorted by status code]

If you only have a few pages to check, you can manually verify the redirects are working using Ayima’s Redirect Path browser plugin.
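If you’d rather script the spot check, here’s a rough sketch in Python using the requests library; the OLD_TO_NEW mapping is hypothetical, so fill it with your own old-to-new URL pairs:

import requests

# Hypothetical mapping of moved pages; fill in your own old-to-new URL pairs.
OLD_TO_NEW = {
    "https://www.example.com/old-mowers/": "https://www.example.com/lawn-mowers/",
}

for old_url, expected in OLD_TO_NEW.items():
    # allow_redirects=False lets us inspect the redirect response itself.
    response = requests.head(old_url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location")
    verdict = "OK" if response.status_code == 301 and location == expected else "CHECK"
    print(old_url, response.status_code, location, verdict)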

Finally, check your XML sitemap. Are the pages losing rankings included in your sitemap? If they’re not, Google might not be able to find them – especially if you don’t have an internal linking strategy. Test your sitemap in Google Search Console to make sure there aren’t any errors that are impeding Google from crawling all your pages.
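If you’re not sure what a valid sitemap looks like, here’s a minimal example with a placeholder URL; each page you want crawled gets its own <url> entry:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/lawn-mowers/</loc>
  </url>
</urlset>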

If your technical and on-page factors aren’t causing big ranking drops, do a link audit. Gather all your backlinks with a tool such as Majestic or Ahrefs. Problems with links could be costing you SERP positions in a few different ways:

Broken or deleted links: When you’re looking up your backlinks in Majestic, you’ll see links that have been deleted marked as such. Check whether those deleted links pointed to the pages that lost ranking. Links are an important ranking factor, so losing them will definitely impact ranking.

Export your list of links as a CSV (you have to have a paid account with Majestic to do this), and then crawl the linked pages using Screaming Frog. Filter by Client Error (4xx) to find your broken backlinks. Reclaiming these links, and restoring the flow of link juice, should be as easy as fixing the 404 errors.

Screaming Frog broken links
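If you don’t have a crawler handy, a short Python script can run the same 4xx check; in this sketch, the backlinks.csv filename and TargetURL column are assumptions to adjust to your own export:

import csv
import requests

# Collect the unique target URLs from your backlink export.
with open("backlinks.csv", newline="", encoding="utf-8") as f:
    targets = {row["TargetURL"] for row in csv.DictReader(f)}

for url in sorted(targets):
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    # Flag unreachable pages and client errors (4xx): these links need reclaiming.
    if status is None or 400 <= status < 500:
        print(status, url)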

Low quality links: If you’ve had your site for a long time, or you’ve recently hired some outside help to do link building, there’s a chance your backlink profile has picked up links from some not-so-great sources. Since Google just released Penguin 4.0 to run in real time, you could be seeing a loss of ranking due to low quality backlinks. Link tools like Majestic and Ahrefs will score your links to help you determine whether they’re helping or hurting your visibility.

Negative SEO: Negative SEO refers to intentionally using spammy link building techniques to tank another site’s ranking. While it’s really rare, it does still happen. So, while you’re digging through your links during your link audit, watch for patterns in anchor text, linking domain or the text around the link: negative SEO attempts generally create tons of links with the same anchor text, the same type of linking site (shady forums and/or directories) and the same text around the link. A quick script like the sketch below can surface such patterns.
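Here’s a rough sketch for counting anchor text frequency in a backlink export; again, the backlinks.csv filename and AnchorText column are assumptions to match to your own file:

import csv
from collections import Counter

# Tally how often each anchor text appears across the backlink export.
anchors = Counter()
with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchors[row["AnchorText"].strip().lower()] += 1

# Anchor texts repeated across hundreds of links deserve a closer look.
for anchor, count in anchors.most_common(10):
    print(count, anchor)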

Fixing your deleted links should hopefully be as simple as reaching out to the linking site’s owner and convincing them to restore your link. If you discover your freelance link builder filled your profile with a bunch of spam, or you’ve suffered a negative SEO attempt, you can also make use of Google’s Link Disavow Tool to clean up your backlink profile.

Check for Manual or Algorithm Penalties

You might have accidentally done something that made Google think you were trying to manipulate search rankings. The culprit could be the “cost-effective” SEO you hired stuffing all your title tags with keywords, a negative SEO attempt, or something as innocuous as incorrectly implemented code on your page. If it’s bad enough, a Google employee will manually penalize your site. If that’s the case, they’ll tell you in your Google Search Console account: check for messages about manual penalties in the Manual Actions section under Search Traffic.

[Screenshot: the Manual Actions report in Google Search Console]

You could have also run afoul of one of Google’s spam-fighting algorithms. Since you’ve already done a link audit, you’ve handled Penguin. Use a plagiarism detector to check if your page content is close enough to other sites to look like duplicate content.

Look Up Google Algorithm Releases

Finally, if all else fails, check for new Google algorithm updates. Do some poking around the web at SEO sites and message boards. Barry Schwartz at Search Engine Roundtable usually covers signs of a major update and will post about it quickly. WebmasterWorld maintains a thread dedicated to Google updates and SERP changes, and lots of activity there is a sign that Google has made some sort of change. There are also algorithm tracking tools like Algoroo, SERPmetrics and AccuRanker that will help you find major fluctuations in SERPs that coincide with your ranking loss.

If you’re using Searchmetrics to track your online visibility, you’ll see known Google updates so you can find out if an update has hurt or helped you.

[Screenshot: a Searchmetrics visibility chart showing a ranking loss alongside known Google updates]

Conclusion

Figuring out the root cause of a ranking drop is absolutely crucial: how can you fix a problem if you don’t know what caused it? While staring a downward-trending graph in the face can induce heart palpitations in even the most seasoned marketers, approaching the diagnosis in an organized fashion will let you minimize and reverse the damage quickly.