SEO experts know how tricky our profession can be. Today, your sites could deliver impeccable results in terms of traffic and CTR, but tomorrow they could drop overnight. Is it a Google update or a manual penalty? Did I lose backlinks? Did someone plagiarize my content? Is it my developers’ problem? A sudden drop in SEO rankings can be caused by anything, from poorly written content to a hacked website.

1. Check for Manual Penalties
This first step is fairly simple. When a specific page or your entire site gets penalized, you will receive a message in Webmaster Tools. Google is usually pretty descriptive about the cause of a manual penalty, so in this case figuring out why your site has received worse rankings or been de-indexed entirely won’t be a problem.

If you don’t see any penalty-related messages in your Webmaster Tools, there are only two options left:
*Google has updated its algorithm
*You or someone from your team is responsible for the rankings drop

2. Make Certain Your Site Isn’t Affected by an Algorithm Update
I can’t stress enough how important it is to get this answer right. If you come to the wrong conclusion, you will spend dozens of work hours and thousands of dollars to no avail. Everybody makes mistakes: a single error in your rankings tracker may have caused the apparent problem, Google might have picked your site for a ranking experiment, or an ordinary server error could be the answer. Your site might be perfectly fine from an SEO perspective, but if you fail to locate the true cause of a rankings drop, you risk ruining your SERPs for real.

Here is what you can do to avoid mistakes (a rough way to script the traffic check follows the list):
*Access your Google Analytics.
*Sift out pages that have suffered a loss of organic traffic.
*Analyze traffic dynamics for these pages (week by week and day by day).
*Review selected pages to find similarities (or differences).
*Create a list of hypothetical reasons for why your SERPs suffered.
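
If you would rather script steps two and three than click through GA reports, here is a minimal Python sketch. It assumes you have exported organic landing-page traffic to a CSV; the file name and the "date", "landing_page", and "sessions" columns are placeholders to adjust to your own export.

```python
# Hedged sketch: find landing pages with a sharp week-over-week loss of
# organic sessions, based on a hypothetical Google Analytics CSV export.
import pandas as pd

df = pd.read_csv("organic_landing_pages.csv", parse_dates=["date"])  # assumed file/columns

# Aggregate sessions per landing page per week.
weekly = (
    df.groupby(["landing_page", pd.Grouper(key="date", freq="W")])["sessions"]
      .sum()
      .unstack(fill_value=0)
)

# Percentage change between the two most recent weeks.
last, prev = weekly.columns[-1], weekly.columns[-2]
weekly["wow_change_pct"] = (weekly[last] - weekly[prev]) / weekly[prev].replace(0, 1) * 100

# Pages that lost more than 30% of organic traffic week over week.
losers = weekly[weekly["wow_change_pct"] < -30].sort_values("wow_change_pct")
print(losers[[prev, last, "wow_change_pct"]])
```

The 30% threshold is arbitrary; the point is to narrow the audit down to the pages whose trend actually broke, rather than reviewing the whole site.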

If your rankings are substantially declining across GA, Google Search Console, and the search results themselves, a tectonic shift in the algorithm may be at work. Consider adjusting your SEO strategy, but don’t rush: I strongly recommend allocating at least a couple of hours to a more detailed analysis first.

3. Run a Detailed Backlink Analysis
A sudden loss of backlinks is another common culprit. To figure out whether that’s the case, use one of these link analysis tools:
*Ahrefs
*Majestic SEO

In Ahrefs, review the referring domains graph and focus on the time period when rankings started to drop; if lost links are the cause, there should be a visible correlation. You can also combine results from both Ahrefs and Majestic to pinpoint every problem in your backlink profile. Your goal is to check whether:

*Your site has suffered a sitewide link drop
*A link drop has affected a group of pages or a specific page
*Links have been removed from a particular site or several related sites

After you locate the problem(s), analyze every page that has lost backlinks. Consider its content, structure, visual elements, and so on. More importantly, list the pages on your site that link to the affected ones; the Screaming Frog SEO Spider tool will help you do this. Now that you have two lists of pages (those that lost links and those internally connected to them), analyze your backlink sources. You need to figure out why they stopped pointing to your site. Were the original pages deleted? Did they change URLs? Was their content updated? Was their design and structure overhauled?
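
If you prefer to diff the data rather than eyeball the graphs, a small Python sketch can list lost links per page. It assumes two backlink exports (one from before the drop, one from after) saved as CSVs with "referring_page" and "target_url" columns; those file and column names are assumptions, not the tools’ actual export format.

```python
# Hedged sketch: compare two backlink exports and count lost links per target page.
import pandas as pd

before = pd.read_csv("backlinks_before.csv")  # assumed export from before the drop
after = pd.read_csv("backlinks_after.csv")    # assumed export from after the drop

# Links present in the older export but missing from the newer one.
lost = before.merge(
    after[["referring_page", "target_url"]].drop_duplicates(),
    on=["referring_page", "target_url"],
    how="left",
    indicator=True,
).query("_merge == 'left_only'")

# See whether the loss is sitewide or concentrated on a handful of URLs.
print(lost.groupby("target_url").size().sort_values(ascending=False).head(20))
```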

4. Audit Your Content
Content has a major influence on the SERPs of any website. First, it feeds search engines data about your site: the better search spiders understand how your site fits into a specific niche, the higher the rankings they will award it. Second, content is what brings visitors to your site; useful, valuable content attracts backlinks from trustworthy resources, and those backlinks in turn push you higher in the search results. My point is that content is important, and you have to analyze it to figure out whether it might have triggered a SERP drop. Specifically, check to make sure your content is unique. To do so, run your site through a plagiarism scanner such as Viper, Quetext, or Plagiarisma, or use the built-in plagiarism checker at the Small SEO Tools website.
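
If you want to spot-check uniqueness yourself instead of relying on a scanner, one rough approach is to search a few exact-match sentences from your page via Google’s Custom Search JSON API. A hedged sketch follows; the API key, search engine ID, and sample sentence are placeholders you would need to supply, and the results still require manual review.

```python
# Hedged sketch: search exact-match sentences from your page and print any
# external URLs that return them. API key, cx ID, and sentences are placeholders.
import requests

API_KEY = "YOUR_API_KEY"           # placeholder
SEARCH_ENGINE_ID = "YOUR_CX_ID"    # placeholder
sentences = [
    "A distinctive sentence copied verbatim from your article.",  # placeholder
]

for sentence in sentences:
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": SEARCH_ENGINE_ID, "q": f'"{sentence}"'},
        timeout=10,
    )
    for item in resp.json().get("items", []):
        # Any result that is not your own domain may be a copy worth reviewing.
        print(sentence[:40], "->", item.get("link"))
```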

5. Analyze Your Site’s Structure and Usability
As an SEO professional, you should work with designers, developers, and usability experts to make sure every change and fix in a site’s design is justified from an SEO perspective. Your top-priority task is to ensure the site is redesigned in such a way that both users and search crawlers can navigate it easily. Your site’s structure and usability are tricky: they can trigger a rankings drop at any time, and the cause is not always obvious. Thus, if anything goes wrong, contact your designers and UX pros immediately to analyze what happened to the affected pages. For instance, someone from your team might have:

*Changed a page’s URL
*Added an intrusive pop-up that triggered a high bounce rate
*Removed targeted keywords from content and tags
*Unintentionally merged several pages
*Placed content that has not been optimized
*Tweaked a page’s design, resulting in code errors
*Messed up the internal linking structure and backlinks
*Deleted a crucial piece of content

Keep your finger on the pulse of everything that happens, or is due to happen, on your site. Designers and usability professionals may not know much about search engine optimization, so you should coordinate with them to avoid SEO-related mistakes.
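
A lightweight way to catch some of the items on the checklist above is to re-check your most important URLs after every release. The sketch below assumes a hypothetical mapping of URLs to the keyword each page title should still contain; adjust it to your own pages.

```python
# Hedged sketch: after a redesign, confirm key URLs still resolve directly
# (no unexpected redirect) and still carry their target keyword in <title>.
import requests
from bs4 import BeautifulSoup

pages = {
    "https://example.com/services/": "seo services",  # placeholder URL -> expected title keyword
}

for url, keyword in pages.items():
    resp = requests.get(url, timeout=10, allow_redirects=False)
    soup = BeautifulSoup(resp.text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    print(url)
    print("  status:", resp.status_code)
    print("  title keyword present:", keyword.lower() in title.lower())
```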

6. Review Your Site’s Code

A site is hidden from indexation in robots.txt

Every site that is updated on a regular basis should have a dev version. Basically, this is a copy of the site used to implement and test new features before moving them to the live website. A dev site is closed off from indexation in the .htaccess and robots.txt files to prevent crawling and indexing of duplicate pages. Mistakes sometimes occur when developers move new functionality to the site’s main version but forget to restore access in robots.txt. A specific page, or even an entire section of the website, can remain hidden from search bots, which eventually leads to a drop in rankings.
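
A quick way to verify this is to test your key URLs against robots.txt with Python’s standard robotparser module. The example.com URLs below are placeholders; use the sections that actually dropped.

```python
# Hedged sketch: check whether Googlebot is allowed to fetch important URLs
# according to the live robots.txt file.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # placeholder domain
rp.read()

urls = [
    "https://example.com/",
    "https://example.com/blog/some-important-post/",  # placeholder URLs
]

for url in urls:
    allowed = rp.can_fetch("Googlebot", url)
    print(("OK      " if allowed else "BLOCKED "), url)
```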

A site’s pages are tagged with “noindex, nofollow”
The same scenario is often triggered by “noindex, nofollow” meta tags. Developers apply “noindex, nofollow” to a specific page while building new functionality and then forget to switch it back to “index, follow” at release. Search bots’ access to the page is restricted and, eventually, your site drops significantly in the search results. The solution is fairly simple: check that your developers haven’t accidentally introduced any SEO-specific mistakes in your website’s code after every update or fix, and make it a rule that they notify you to look through the updated pages every time a change is made.
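
You can also script this check. The sketch below flags pages that still carry a "noindex" directive in either a meta robots tag or an X-Robots-Tag response header; the URL list is a placeholder.

```python
# Hedged sketch: flag pages that are still marked noindex after a release.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://example.com/new-feature-page/",  # placeholder
]

for url in urls:
    resp = requests.get(url, timeout=10)
    header = resp.headers.get("X-Robots-Tag", "")
    meta = BeautifulSoup(resp.text, "html.parser").find("meta", attrs={"name": "robots"})
    meta_content = meta.get("content", "") if meta else ""
    if "noindex" in header.lower() or "noindex" in meta_content.lower():
        print("noindex still set:", url)
```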

301 redirects are placed incorrectly
A 301 redirect tells search engines that a page has moved permanently and passes its authority to the new URL. The only problem is that sometimes even experienced developers place 301 redirects incorrectly. As a result, you can end up with duplicate pages, which are immediately downgraded by search spiders. Ensure that your developers are properly instructed on how to place 301 redirects; a single error can ruin it all for your site, so be careful.
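
A simple way to audit this is to follow each old URL and inspect the redirect chain: you want a single 301 hop to the intended destination, not a 302, a chain, or no redirect at all. The old-to-new URL mapping below is a placeholder.

```python
# Hedged sketch: verify that old URLs 301-redirect in one hop to the expected new URL.
import requests

redirects = {
    "https://example.com/old-page/": "https://example.com/new-page/",  # placeholder mapping
}

for old, expected in redirects.items():
    resp = requests.get(old, timeout=10, allow_redirects=True)
    hops = [(r.status_code, r.headers.get("Location", "")) for r in resp.history]
    print(old)
    print("  hops:", hops or "no redirect")
    print("  single 301:", len(hops) == 1 and hops[0][0] == 301)
    print("  lands on expected URL:", resp.url == expected)
```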

7. Conduct a Competitor Analysis
Meanwhile, you and your team might not be to blame for a rankings drop at all. Sometimes your competitors do such a good job with their site, content, UX, and SEO that your website drops in the search results simply because of the fierce competition. To avoid unpleasant surprises, I recommend monitoring competitor sites on a regular basis. You can do this manually or use Versionista, which is not free but is an efficient instrument for comparing versions of a page over time.
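
If a paid monitor isn’t an option, a bare-bones alternative is to snapshot competitor pages yourself and compare hashes between runs. The competitor URL and snapshot file name below are placeholders, and note that this flags any HTML change, not just meaningful ones.

```python
# Hedged sketch: hash competitor pages and report which ones changed since the last run.
import hashlib
import json
import pathlib
import requests

competitors = ["https://competitor.example/pricing/"]  # placeholder URLs
snapshot_file = pathlib.Path("competitor_hashes.json")  # placeholder snapshot file
previous = json.loads(snapshot_file.read_text()) if snapshot_file.exists() else {}

current = {}
for url in competitors:
    html = requests.get(url, timeout=10).text
    current[url] = hashlib.sha256(html.encode("utf-8")).hexdigest()
    if url in previous and previous[url] != current[url]:
        print("changed since last run:", url)

snapshot_file.write_text(json.dumps(current, indent=2))
```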

Conclusion
A drop in rankings is a challenge for every SEO professional. However, there are sure-fire methods to pinpoint and eradicate the issues that tanked your site’s performance in the search results. Analyze your SEO campaign step by step, and I guarantee you will locate the problem and put yourself back on the path to success.

Source: Search Engine Journal

Peter Zmijewski is the founder and CEO at KeywordSpy. His expert knowledge of Internet Marketing practices and techniques has earned him the title “Internet Marketing Guru.” He is also an innovator, investor, and entrepreneur widely recognized by the top players in the industry.