
SEO Diagnosis: A Drop In Organic Traffic

6 data sources to check and why

It’s a common scenario for an SEO Manager: you come into the office one morning, open up your SEO dashboards, and notice a large drop in organic traffic to a core product page for your main software product. It seemed to happen almost overnight.

How do you go about diagnosing the issue?

What are some possible hypotheses for what could have caused the drop?

Before we get too far into the details, let’s set the stage. Suppose you do SEO for an enterprise software company with a suite of products for developers and content managers. Your company serves over 130,000 customers worldwide. There are teams similar to yours around the world in your primary markets, e.g., Canada, Germany, and Australia.

The US market is the largest of your target markets, and the .com site is ideally the one you want ranking in search engines. The website is on the same scale as the IBMs, Microsofts, and Salesforces of the SaaS world.

You’re an experienced SEO, and after some digging, you realize that several teams in different countries have published content that is stealing traffic away from the core US product page, and that this practice could also impact the rankings and performance of other products in the future.

What do you need to do to course-correct with cross-functional teams?

How do you go about educating teams on how to avoid this kind of issue in the future?

There is a lot to think about here. And you’ve only just finished your first cup of coffee.

Where to start looking  

First things first: diagnosing a drop in traffic means looking at a handful of data sources and formulating a hypothesis. Here are six areas where SEOs should begin looking for clues, what to look for, and why each is relevant to SEO:

  1. Google Analytics
    • Look for: Which type of traffic declined: organic, direct, or referral? Did other, similar pages on the site lose rankings within this same timeframe, or is this an isolated incident? (A scripted version of this check is sketched after this list.)
    • SEO relevance: Knowing which traffic source declined means you’ll be able to backtrack to the source of the issue. A decline in referral traffic, for example, may mean some links on referring websites are broken. A decline in organic traffic is harder to diagnose, but it generally points to reduced visibility of your URLs in the SERPs.
  2. Google Search Console 
    • Look for: Changes to the page in the Performance section; look at data on Pages, Position, and Search Appearance. Are there any new warnings for this page that Google is flagging for you? (A sketch for pulling this data via the API also follows the list.)
    • SEO relevance: This is largely where site owners can “communicate” with Google about their website’s performance, so it’s important to double-check that primary elements like the XML sitemap are still submitted and up to date.
  3. Development teams & Robots.txt
    • Look for: Ask internal dev teams when the page was last updated or is next scheduled to be updated (whether that’s tracking pixels, HTML code, or on-page content). Which team made the last update to the page? Additionally, confirm with the dev team leads that nothing changed in the robots.txt file.
    • SEO relevance: Site owners use robots.txt directives to communicate crawl instructions to web robots; bots are either allowed or disallowed from crawling the various folders of the site. If the page were accidentally disallowed from being crawled, that alone could impact its rankings. (A quick way to verify this is sketched after the list.)
  4. Enterprise SEO software tools (Searchmetrics, Conductor, BrightEdge)
    • Look for: Indications that other URLs have begun ranking for the term(s) your page was previously ranking for. Areas like:
    • Winner/Loser Keywords: What terms were ranking on the page before and after it lost rankings? It could be that a new, better page is in the index and Google is favoring it.
    • Overall Content Relevance (E-A-T): Some tools can measure how relevant the content is against multiple, similar pages in Google’s index. If any on-page content changes were made recently, it’s worth investigating, since it’s possible the changes were ultimately not helpful to users and the page is now underperforming.
    • URL Rankings: Are there similar pages on the website (even other versions by country) that are cannibalizing rankings of this page? This happens frequently with large enterprise sites.
    • Crawl data & log files: Run a crawl to get an idea of what Google is seeing (or not seeing) on the page, and inspect log files for additional insight if they’re available (a log-parsing sketch follows the list). Also check the backlink profile of the page; it’s possible the URL is suffering from spammy and/or malicious backlinks dragging down page authority.
  5. Competitor site performance
    • Look for: Have similar pages on your competitors’ sites lost rankings too? That could mean an algorithm update is affecting your industry vertical, or that Google has decided the page is no longer relevant for certain search queries.
    • SEO relevance: The search intent of users changes over time. That’s why it’s important to update content to be relevant to the nature of what people are searching for and to continue improving upon existing content.  
  6. Search Engine Results Page (SERP)
    • Look for: Do a manual check to see if there are new features in the mobile and/or desktop SERP (whichever device you care about getting traffic from). New elements like a carousel or more images can push organic listings further down the first page.
    • SEO relevance: Simply put, Google is always looking to surface results based on what users want: “Google Images accounts for more than 20% of all queries Americans performed in 2018, and that’s down from a high of nearly 30% three years ago” (SparkToro research on 2018 search market share).
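
Several of these checks lend themselves to quick scripts. For the Google Analytics check in item 1, here is a minimal sketch, assuming a Universal Analytics view and the Reporting API v4; the view ID, credentials path, and dates are placeholders. It compares sessions by channel before and after the drop:

    # Sketch: compare sessions by channel before/after the drop (GA Reporting API v4).
    # The view ID, credentials file, and date ranges below are placeholders.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    creds = service_account.Credentials.from_service_account_file(
        "service-account.json",  # placeholder credentials path
        scopes=["https://www.googleapis.com/auth/analytics.readonly"],
    )
    analytics = build("analyticsreporting", "v4", credentials=creds)

    response = analytics.reports().batchGet(body={
        "reportRequests": [{
            "viewId": "ga:123456789",  # placeholder view ID
            # One window before the drop and one after, so per-channel declines stand out.
            "dateRanges": [
                {"startDate": "2019-05-01", "endDate": "2019-05-14"},
                {"startDate": "2019-05-15", "endDate": "2019-05-28"},
            ],
            "metrics": [{"expression": "ga:sessions"}],
            "dimensions": [{"name": "ga:channelGrouping"}],
        }]
    }).execute()

    for row in response["reports"][0]["data"]["rows"]:
        channel = row["dimensions"][0]
        before = row["metrics"][0]["values"][0]  # first date range
        after = row["metrics"][1]["values"][0]   # second date range
        print(channel, before, "->", after)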
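
For the Search Console check in item 2, a similar sketch can pull daily clicks and average position for the affected URL through the Search Analytics API; the property and page URLs are placeholders:

    # Sketch: daily clicks and average position for one URL (Search Console API).
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    creds = service_account.Credentials.from_service_account_file(
        "service-account.json",  # placeholder credentials path
        scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
    )
    service = build("webmasters", "v3", credentials=creds)

    response = service.searchanalytics().query(
        siteUrl="https://www.example.com/",  # placeholder property
        body={
            "startDate": "2019-05-01",
            "endDate": "2019-05-28",
            "dimensions": ["date"],
            "dimensionFilterGroups": [{
                "filters": [{
                    "dimension": "page",
                    "operator": "equals",
                    "expression": "https://www.example.com/products/core-page",  # placeholder URL
                }]
            }],
        },
    ).execute()

    for row in response.get("rows", []):
        # Falling clicks with stable position suggests a SERP-layout issue;
        # a falling position suggests a genuine ranking loss.
        print(row["keys"][0], row["clicks"], round(row["position"], 1))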
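
For the robots.txt check in item 3, the Python standard library can confirm whether Googlebot is still allowed to crawl the URL; the domain and path here are placeholders:

    # Sketch: verify the affected URL is not accidentally disallowed for Googlebot.
    # A single accidental rule such as:
    #   User-agent: *
    #   Disallow: /products/
    # would be enough to block the page from being crawled.
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")  # placeholder domain
    rp.read()

    url = "https://www.example.com/products/core-page"  # placeholder URL
    if rp.can_fetch("Googlebot", url):
        print("Googlebot may crawl", url)
    else:
        print("Googlebot is DISALLOWED from crawling", url)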
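
And for the log files mentioned in item 4, a short parser can count daily Googlebot requests to the affected URL; a sudden drop to zero is a strong signal of a crawl problem. This sketch assumes a common/combined log format, and the log path and URL path are placeholders:

    # Sketch: count daily Googlebot requests to the affected URL from an access log.
    import re
    from collections import Counter
    from datetime import datetime

    LOG_PATH = "access.log"         # placeholder log path
    TARGET = "/products/core-page"  # placeholder URL path
    DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")  # matches e.g. [15/May/2019:...

    hits = Counter()
    with open(LOG_PATH) as f:
        for line in f:
            if "Googlebot" in line and TARGET in line:
                match = DATE_RE.search(line)
                if match:
                    day = datetime.strptime(match.group(1), "%d/%b/%Y").date()
                    hits[day] += 1

    for day, count in sorted(hits.items()):
        print(day, count)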

Formulate A Hypothesis – Why has the drop happened now? 

Sometimes traffic drops because of a manual change that was pushed live to the code. Other times it’s something external in the market (e.g., an algorithm update) or a shift in consumer demand. Based on the possible internal and external factors, begin to list out some hypotheses for the drop in traffic.

On the whole, there are two sources from which rankings are affected: internal and external.

Some of the most common drops in organic traffic are the result of external changes, such as an algorithm update or the SERP landscape shifting (e.g., images being prioritized). Competitor pages can steal rankings if they begin occupying better (higher) positions on the first page. Lastly, a page or site can drop if Google suspects questionable ways of gaining rankings (black- or gray-hat SEO tactics) are being used; this results in a manual penalty.

Conversely, rankings can be affected by internal factors, like manual changes made to the page by different product, development, or content teams. Internal cannibalization can also occur, where other existing pages within the same domain outperform the page you want to be ranking.

Both internal and external factors should be considered and evaluated in order to identify the source and best solution.

It’s common for enterprise-level organizations to have many teams interacting with the website at any given time, which is why it’s important for SEO leads to have open lines of communication and relationships with all teams so they can quickly address any imbalances.

How to prevent future drops in traffic

In this scenario, the drop was caused by internal teams where different country pages were cannibalizing rankings of the US .com page.  

How can an SEO lead go about educating cross-functional, global teams to avoid this issue in the future?

One approach is for those in SEO leadership roles to work with internal team leads on creating a shared resource document that is referenced (almost like a checklist) by the various product, content, design, and other teams when updating and publishing important core pages. The reference sheet should include the important on-page elements that contribute to rankings and list the technical SEO requirements that need to be adhered to. Things like:

  • KWs & Relevant content: List the primary and secondary keywords/topics for core pages that contribute to rankings. Identify region-specific content or questions that should be addressed so that each page is relevant to local searches and therefore not competing with other URLs.
  • Technical SEO: In this case, hreflang tags should be applied to each regional page to distinguish the content meant for each region, as in the example below.
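
As a minimal illustration (the URLs below are hypothetical), each regional version of the page should cross-reference all the others in its <head>:

    <!-- In the <head> of every regional version of the page -->
    <link rel="alternate" hreflang="en-us" href="https://www.example.com/products/core-page" />
    <link rel="alternate" hreflang="en-ca" href="https://www.example.com/ca/products/core-page" />
    <link rel="alternate" hreflang="en-au" href="https://www.example.com/au/products/core-page" />
    <link rel="alternate" hreflang="de-de" href="https://www.example.com/de/products/core-page" />
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/products/core-page" />

This tells Google which regional audience each version serves, so the US .com page isn’t forced to compete with its own international siblings.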

It’s an ongoing combination of maintaining SEO guideline documents in a shared location, using dashboards to monitor ranking fluctuations, and educating the broader internal organization on SEO best practices to help them become more aware of the things that negatively impact rankings. 

Now, it’s back to scanning the horizon for Pandas and Penguins. 

Over to you, fellow SEOs!

Have you had to navigate a similar experience in your org? What other data sets have you used that helped you diagnose and resolve a traffic loss issue? Or, what did I miss in my list that can be added here?  Share this article on Twitter and let me know.


This post represents my personal and professional opinion, not those of my employer.
