AI: The Shiny New Object in Search

It’s apparently a new day in Search. Why? AI (Artificial Intelligence) is officially integrated into two notable search engines, Google and (the new) Bing.


I finally had a chance to digest some of the headlines and read a few industry articles on the latest shiny new object in SEO: AI in Search. Here’s an overview of what I find interesting and the trends to watch.

Interestingly, in Google’s attempt to beat Bing to the punch in announcing its AI product, the product came off as rushed to market, and the presentation itself wasn’t as well coordinated as it should have been, resulting in the company’s stock tanking.

Alphabet lost about $100 billion in value after a demo meant to show off the AI-powered chatbot bungled its response on the NASA telescope.

https://www.cnet.com

That is not a headline any investor wants to wake up to.

For context, Google announced on February 6th the introduction of Bard to Search. At first, based on the location of AI in their SERP, it appeared to be a revamped version of a Featured Snippet, an organic feature that sits at the top of the results page.

What is the purpose of artificial intelligence?

Google’s product announcement made me wonder what queries they were trying to target. What problem were they trying to solve? Reading through their announcement, though, I think this key concept completely flew under the radar:

“AI can be helpful in these moments, synthesizing insights for questions where there’s no one right answer. Soon, you’ll see AI-powered features in Search that distill complex information and multiple perspectives into easy-to-digest formats, so you can quickly understand the big picture and learn more from the web.”

Google AI Search announcement

Still, it wasn’t until I read Brodie Clark’s article, where he talked about “NORA (No One Right Answer),” that I realized incorporating AI in search is an attempt to solve for informational searches with no one right answer.

What is artificial intelligence in simple words? It’s an Ask-Me-Anything machine.

Sample AI question prompts from the New Bing homepage.

A tale of two products – Google Bard & Bing AI Chat

Google Bard: Citations – Out, damned spot; out, I say! —One, two: why then, ’tis time to do’t.

Bard…Shakespeare. Get it?

The key drawback of how AI is being presented, at least initially in Google’s interface, is that the AI responses do not provide clear links or citations to web publishers. Glenn Gabe’s article calls this out as “Google’s war against publishers,” and he’s not wrong. But I think the behavior we’ve seen with Featured Snippets tells a similar story, because the same argument came up with the emergence of zero-click search results: “if I can see the answer in the SERP, why do I need to click on the result?” Everyone was worried about their organic traffic and CTR when zero-click results emerged.

Over time, earning the Featured Snippet spot did result in some organic traffic as those consumers who desired to learn more clicked through. Gabe added, “It’s also worth noting that Google has not answered any questions about citing sources. And I mean literally nothing has come from anyone at Google about linking to publishers (which makes me think they were unprepared for the question). That’s also scary…so we’ll see where this ends up.”

That should have been a core tenet of the product. It’s very strange that no one at Google has commented on citations.

On a separate but related note, the announcement of AI-driven content has brought speculation that jobs will be taken away because AI will create all of this incredible content at scale. But that’s a fallacy.

AI responds to prompts from humans and can only learn from what humans teach it or feed it. It can’t think for itself. And it’s prone to inaccuracies and biases. AI-generated content needs, and will continue to need, human oversight.

Lily Ray makes a solid point in her article about the lag in the information used to train AI content generation tools:

“For example, ChatGPT was trained on data ending in late 2021, although the tool does appear to be improving to reflect more recent information. Given the importance of fresh information in so many areas of SEO, this is a significant limitation for the tools to be able to produce entirely helpful content.” 

In relation to the SEO community, the guidance Google emphasized about content creation comes down to the quality of the product. AI can be involved in generating content (there’s no penalty for this), but if it isn’t helpful to users, it will get weeded out by the systems and algorithms in place (e.g., the Helpful Content Update).

Marie Haynes’s newsletter highlights this as well: “It is perfectly fine to use AI in your content creation efforts as long as your end product demonstrates E-E-A-T and is helpful to people. However, Google does not recommend you list AI as the author of your content.”

The extra “E” is for Experience.

Bing AI Chat contains citations (peasants rejoice!)

The next day, Microsoft had a huge event in Seattle where they showcased their AI Search product. Their implementation was well thought out and wowed the media. Bing’s AI chat attributed the information to the sites where it “learned” the answers it was providing.

This is a step in the right direction in terms of product attributes, but without the volume of users Google can drive, it’s small potatoes to publishers.

What’s interesting about Bing’s approach is their POV that it’s time for a change. Barry Schwartz attended the announcement and posted about it on Twitter, noting:

  • Bing will launch an all-new search engine powered by AI. It’s better, will answer questions, you can chat with it, and it can create content for you.
  • 40% of all queries result in someone clicking back, and most searches are three keywords or less…
  • He (Yusuf Mehdi) explained the navigational, informational, and other searches people do today. 50% of people’s searches go unanswered, which is why it is time for a new approach to search.

Product War: To cite or not to cite

Whether or not annotations are visible in the SERP is, IMHO, irrelevant. The important piece for publishers is being rewarded with organic traffic when users actually click through from the AI result to the publisher’s site. Regardless of whether it’s a Google or Bing search, if the No One Right Answer section from AI doesn’t link directly to the original source(s), that’s when you’ll really have an uprising on your hands.

At that point, there’s zero incentive for publishers to invest time and resources creating content if organic traffic as a referral source in their analytics doesn’t deliver.

Speaking of citing sources, what if there’s a scenario where you don’t want AI chatbots learning from your content (i.e., subsequently misquoting it or using it out of context)? I thought that was an interesting consideration after seeing this article in the SEO FOMO newsletter by Aleyda Solis: How to Block ChatGPT from learning on your website, by Roger Montti.

The technique involves the robots.txt file, but it’s not a guarantee. At the very least, it’s nice to have an idea of how to attempt this if such a scenario presents itself.
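For illustration, here’s a minimal sketch of the robots.txt approach. It assumes the crawler you want to block is Common Crawl’s CCBot, one of the data sources used to train ChatGPT-era models; that’s the user-agent discussed in Montti’s article, as I recall, so verify against the article before relying on it:

  User-agent: CCBot
  Disallow: /

Keep in mind this only works for crawlers that choose to honor robots.txt, which is exactly why it’s not a guarantee.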

Aleyda always does a great job curating her newsletter content. I highly recommend signing up.

AI trends to watch

  1. Product dominance will win.
    Will Bing’s AI product tip the scales in their favor, where consumer behavior will change and more users will start with Bing vs. Google? IMO, I don’t yet think this is likely for 10-20 years because, as Gabe pointed out, “Google typically drives exponentially more traffic than Bing” (his post offered a handful of comparisons showing the difference between GSC and BWT Clicks & Impressions data). The adoption curve is still too great.
  2. Being second best in the market means more room to test & iterate.
    Microsoft has multiple sources of revenue, and Search is a relatively small one at this point. That gives them an advantage: they can afford to “test and learn” their way from the product they launched this month to a more refined, mature product. Google, on the other hand, has to worry about the impact on Search; bigger changes are riskier because they have the lion’s share of the market.
  3. There will be short term & long term effects of AI – and they’ll be very different.
    Gabe’s idea that AI is headed towards manifesting “Jarvis for Search” (as in the AI assistant created by Tony Stark) got me thinking. I think it would be incredible, because Jarvis is Stark’s right hand and intuitively knows what, when, and how he needs answers or action. But I’d only be on board if it operated in a closed network; can you imagine your Jarvis inundating you with advertising or spam?

Final thoughts

IMHO, it’s one thing for Bing to say it wants to provide links to publishers in their AI chat.

But the bigger question is “will more people use Bing as opposed to Google?”

User behavior is a completely different animal; influencing it…changing it…creating new habits…that takes a long time. In 2023, Bing is not the current market leader when it comes to search engines. Google is. And by a very large margin.

Worldwide desktop market share of leading search engines from January 2015 to December 2022. Source: Statista 2023

Hypothetically, even if Bing has a superior search product with the addition of AI, it’s still all about adoption and preference and doing what’s easiest.

The consumer POV is: “What app is on my phone that I feel gives me the fastest, most reliable, accurate results?” That wins every time.

As far as No One Right Answer results go, take them with a grain of salt. These AI products were basically rushed to market, and there’s a long way to go towards improving their outputs.

For the record, none of this was written using AI. I used my brain, my experience as an SEO professional and a keyboard to write out my thoughts.

The information contained in this post does not reflect the views of my employer.

What I learned from improving a website’s header navigation: the road to page 1

This post is a short case study on an in-house initiative I began with my new team in January, implementing our agency’s recommendation to increase the number of URLs in our header navigation. I’m fortunate, as the acting SEO Manager, to benefit from the previous SEOs who oversaw the site and laid the foundation for growth.

For additional context, below are a few business stats:

  • The business does about $1M in revenue and there are roughly 2 million product SKUs (and growing) in the appliance parts vertical.
  • The Clicks & Impressions trend (over 6 months between Oct-March) is approximately 9M Clicks and 140M Impressions.
  • Even though it’s an e-commerce site, peak seasonality is more closely aligned to the spring and summer months as opposed to the typical Q4 retail holiday season.

In any case, I’ll keep this one short because it’s more about sharing the findings and observations in the 3 months since implementation.

I can’t take full credit for this; in my current in-house role I’m working with a very strong technical SEO agency, Merkle, and this project had been part of their recommendations for our site. I will, however, take credit for shepherding the initiative and driving the implementation and launch with our cross-functional teams. This project was one of the early-stage technical initiatives on our SEO Roadmap because it’s the kind of SEO investment that shows early results and pays off in the long run.

It’s the best of both worlds and I’m excited to be able to share the early results.

SEO product feature statement:

The issue was that there were no links in the header for Googlebot and users to easily discover our best pages.
If we increase the number of links from 32 to 140,
then we will increase the number of unique links to high-search-volume pages, making them more visible to our users and more discoverable for Googlebot.

Implementation timeline

Sorry to disappoint, but I’m going to skip over how many sprints it took to implement the updated header nav across devices because, in reality, each team and its respective resources is different. Suffice it to say things always tend to take a bit longer than you’d like, so plan accordingly and communicate often with leadership. Pro tip: try not to release any big feature going into the weekend (especially a long holiday weekend). Release it on your typical cadence (ideally mid-week), because that allows you to have the core team available during the work week to revert changes or fix anything that slipped through the QA cracks.

Ok, let’s FF (fast forward) to the results.

Early Results & KPIs


We did it! Now what?

This is what I like to call the “hurry up and wait” part of SEO. In this scenario, we launched the feature in late January. Our SEO project KPIs were to…

  1. Increase the URL Rankings of these pages.
  2. Increase Traffic (Visits) to the site.
  3. And yes, as an e-com business, also impact Conversions (Sales).

SEO Rankings

As a best practice, the industry standard is to allow at least 90 days before expecting SEO results, because you need to give Googlebot time to come back to crawl and index your site, discover the changes, and subsequently begin to rank your pages. As of April, we’re about 70 days in, and here’s what we’re seeing:

You’ll notice in the charts below that, within the first few weeks, the anchor text keywords and associated URLs began gaining rankings on page 1 of the Google SERP. It was pretty exciting to see.

I’m most excited about this chart since it tracks the progress of our top selling products:

Check out the fluctuation the week of March 1st! Phantom gains?!

This chart highlights the top brand names we carry (based on demand AKA Search Volume):

This section fluctuated the most: the pages themselves resemble a collection of products for a category like “cookware.” In this group there were roughly 9 URLs that just never gained traction.

SEO Traffic

In terms of traffic (Visits) to the site, I’d typically like to show a pre/post level of Traffic on each URL so that we can compare apples to apples. But in this scenario, I can share the Traffic results as such:

Post launch, roughly 70 days in (to the early part of April), the new URLs in the header nav are contributing 35% of site Visits. This excludes items that already existed in our header nav that we intend to maintain (staple links like “Your Account” and “Orders”). None of the new URLs were previously in the header nav, so that’s an additional 35% of traffic on top of the existing header nav URLs, which were contributing, on their own, roughly 56% of Visits. Not a bad return.

What didn’t work

As mentioned earlier, there were 9 URLs within a general accessories type of category that never picked up any traction in terms of rankings. Internally, we drew up a few ideas and hypotheses about what might be occurring, but in the end it came down to a business decision to remove the links due to the upcoming seasonality.

In any other scenario where those pages were valuable to the business but were underperforming, I’d recommend a test and learn approach. Especially if those pages drive some amount of business revenue. In our case, the business cycles in and out of its seasonally relevant products.

Conclusion

Because it takes time to realize and see SEO results, it’s important to prioritize the foundational improvements that can have the biggest impact (improving things like site architecture, internal linking, etc.). You’ll need those early wins to build credibility with internal teams and within the organization to demonstrate that Natural Search traffic, among the other marketing channels, is a viable contributor to the business bottom line.

It reminds me of a line from the movie Moneyball between characters played by Brad Pitt and Jonah Hill. In one scene, Hill says, “Your goal should be to buy wins.” Well, fellow digital marketers, we cannot buy the SEO equivalent of wins, which one could argue are page 1 rankings, but we can invest in doing the right things, consistently. In the movie, the duo hedge their bets on building a team of players that “get[s] on base.”

That, my dear marketer, is why I wrote this small snapshot of our project as an SEO case study. To show you that you can “get on base.”

A small professional confession

I had some pretty bad imposter syndrome writing this and deciding whether or not to publish it, because there are some very smart and talented SEOs who could probably get a better or faster outcome than I did. But here’s the thing (“You know nothing, Jon Snow”): every team, company, or client has its own dependencies, from internal resources, tech stack, legacy site issues, and internal process & communication to technical implementation.

Under my watch, I’m proud to say we launched it and we’re seeing early positive results that will inform future iterations of this project.

Now over to you: What are your thoughts or what have you done differently that produced different results? Tweet me or leave a comment below.

The opinions, thoughts and perspective expressed in this post are my own. While I am a representative of the company, these are not necessarily the views of my employer.

One day at Google: Webmaster Product Summit

On Monday, November 4th, around 50 or so SEOs descended upon the Googleplex campus in Mountain View, California to attend an invitation-only, one-day Product Summit.

There was much of this…


See what I mean? SEOs love Twitter, btw. There are tons of insights under the event hashtag #GWCPS. I highly recommend combing through it when you have time.
Thanks in advance to Martin MacDonald for being my model. Your check is in the mail 😉


They let us ask A LOT of questions, tossing around a tiny speaker box…

I think, for the most part, everyone had a great time, felt heard, and both sides enjoyed the chance to learn something from each other.


I had a fabulous time catching up with friends. Talk about #MondayMotivation!


And I thoroughly enjoyed the opportunity to meet more industry pros in person. Yup, that’s me trying to keep my cool talking SEO shop with Glenn Gabe 😛 I’m a big fan of his digital marketing blog and really enjoyed chatting with him in person.


Did I mention there were snacks? And breakfast AND lunch?! Their food team is amazing, feeding the day-to-day staff while accommodating our special event that day.

And gift bags?! The Google Webmaster team really took care of us.

I’m incredibly grateful to have been a part of this event.


Tips & Trends from the Google Webmaster Conference

Ok, let’s get down to brass tacks. What were my takeaways?

Snapshot of the agenda & event

What: Google Webmaster Product Summit
When: Monday November 4, 2019
Where: The Googleplex in Mountain View, CA
Who: Hosted by Google; in attendance were many well-known speakers, consultants & SEO practitioners alike.

These are just a handful of Search folks I recognized & got to mingle with: Micah Fisher-Kirshner, Denis Yevseyev, Martin MacDonald, Loren Baker, Michael King, Jennifer Slegg, Jackie Chu, Glenn Gabe, Barry Schwartz, Sung N., Elliot Mellichamp.

Why: The event was conceptualized as a “meet the ecosystem” initiative, a two-way street where webmasters and the core Search Product Management teams could interact.

How: Google really does think about webmasters and content providers. The day’s events were organized to include brief talks from product leads and an open-forum Q&A led by search industry veteran Danny Sullivan (@dannysullivan).


In return (for basically providing food and free WiFi), Google asked attendees to refrain from revealing specific individuals in discussions involving this event.

Industry SEOs’ Published Event Recaps

There were a handful of fantastic recaps that surfaced immediately following the event. These were:

Takeaways: Google Webmaster Conference Product Summit by Barry Schwartz, on SEO Roundtable, provided a great overview of the technical SEO aspects discussed.

and his second article, 5 Tips and Trends from the Google Webmaster Conference on SEL. 

Plus, Jackie Chu (who was writin’ up a storm from our row!) published a great play-by-play recap of the Product Summit on her blog.

I can’t forget the numerous folks live Tweeting insights throughout the day. You can find the thread under the event hashtag: #GWCPS.

Insights from the Product Fair

The afternoon session featured a section called “Product fair” where all of the attendees could meander around the room speaking with the various product leads of Search. Much like a Science Fair, each product manager stood by their product board to answer questions. I think a lot of large, enterprise-level companies could benefit from trying something like this within their respective product teams.

These were the three products I visited:

  • Google Images
  • Organizing information on Search
  • Cameos on Google (Video & Influencers)

In my opinion, this was one of the best parts of the day because it gave me a different perspective about Search as a product Google has operating in the market. In speaking with the product leads, I realized they have a completely different perspective about their product than I have as a marketer and SEO professional.

Obviously, the product leads could not divulge any specific ranking tactics (I also tried to be respectful and not ask those types of pointed questions), but what I found interesting was that they were each very much focused on their own product area and didn’t have much knowledge, if any at all, about Google’s elusive algorithm. I think what you’ll find interesting below in the product cards are the areas of product improvement, goals, and impact they are focused on.


 

Product Card: Google Images

As a product manager at Macy’s, I found the image optimization best practices (below) to be particularly relevant for our PDPs (product detail pages). Namely:

  • Use descriptive titles, captions and filenames.
  • Use high quality images as well as beautiful inspirational images.

The insight I picked up on from the product lead was about user behavior; it seems Google is noticing users are coming to the Image Search tab to find web pages. Which means it’s likely that websites can garner organic traffic by following these optimization tips and using great images.

You can find the Twitter thread on this topic under the event hashtag, #GWCPS.

 

Product Card: Organizing the SERP (Search Engine Results Page) 

To me, this product is particularly interesting because it speaks to how elements get ranked in the SERP, ostensibly how the algorithm surfaces the most relevant, highest-quality results.

In speaking with the product owner, I noticed he didn’t quite seem to understand my questions along the lines of “as a product owner, how do you determine which element is first, second, etc.?” Ok, so maybe I was getting too close to the secret ranking sauce 😉

I think the operative word on the product card is “organize” and the emphasis is on “organize the search page by intent”. That’s VERY user-focused. Instead of operating from the POV of “which web page best matches the search query?” it focuses on laying out information based on intention. To that end, I was fascinated to see these product impacts:

  • Increased page utility
  • Improved page scannability

So much of SEO is focused on securing placement above the fold. Rightfully so, because that’s largely where the majority of clicks come from. However, from a product perspective, it seems Google is much more zoomed out on the problem of how to organize information; their focus is holistically arranging the mobile page so that the experience creates better usability, scannability and reduces friction. We tend to think users don’t scroll but the product lead specifically mentioned scrolling is an inherent user behavior on mobile devices. Here’s hoping more users start clicking on results that might be at the bottom of the page.

In my opinion, the Highlights section should have mentioned the recent addition to the algorithm known as BERT (Bidirectional Encoder Representations from Transformers), a state-of-the-art language model used for Natural Language Processing. It’s now being used for 10% of searches to get better context and understand more ambiguous queries. Metaphorically speaking, the distinguished engineer who spoke about how Google understands synonyms referred to it as the rising tide that lifts all ships.

Further reading: BERT explained


 

Product Card: Cameos

This is a new video and influencer-focused product Google is developing called Cameos. It’s an app that can be downloaded. Google is currently beta testing this feature with celebrities and influencers.

How it works: “Experts record video answers to top-searched questions about themselves, their work, and the topics they’re knowledgeable about (e.g. cooking, fitness) …”

I can see this being a great and useful product for recognizable personalities. It made me begin to wonder if they could expand the sphere of expertise to include the self-proclaimed influencers who might not be movie-star celebrities but who have expertise and experience. I wondered how Google might go about determining E-A-T (Expertise, Authoritativeness, Trustworthiness) in this new product.


 

Improving Search over the Years

 A distinguished engineer spoke on this topic, primarily on synonyms and natural language. Search has evolved largely because of how people search for things in text (queries) and how they speak questions into a smart device (voice search).


I took this picture because the last bullet on this slide intrigued me: “Google’s Synonyms System: One of Google Search’s most important ranking components”. Why is this important? In my opinion, it means the algorithm really does factor in relevance against each query.

A few highlights from his talk:

  • Using BERT for 10% of searches. It can solve some language-related tasks better than most systems, and it helps disambiguate longer queries.
  • How do you decide which words are synonyms for each other? It comes down to G’s evaluation process: A/B testing and search raters.
  • Compositional brand terms are determined by user traffic. G thinks of those as strings & watches to see what users do next.
  • People use emoji all the time. But often don’t know what they mean.
  • We treat all characters we see in links as full characters (first-class citizens), even emojis.

In summary, he encouraged webmasters to write naturally and write for humans.

 

Conclusion

What was the most important thing you learned and how will you implement it?

This question came from the feedback form Google sent out. My answer:

Learning that every change (i.e., algo adjustment) has wins & losses gave me a new perspective and made me more empathetic to search as a product. I’m going to try to evangelize to SEOs & marketers to write for humans, not for what they think the search engine will reward.

The good news is, the Google Webmaster team aims to create more venues like this one for feedback in the future. I hope more folks get to experience what I did that day.

 

The opinions, thoughts and perspective expressed in this post are my own. While I am a representative of the company, these are not necessarily the views of my employer.

Diagnosing A Drop In Traffic: 6 Data Sources to Check & Why

It’s a common scenario for an SEO Manager. You come into the office one morning, open up your SEO dashboards, and notice a large drop in traffic to a core product page for your main software product. It seemed to happen almost overnight.

How do you go about diagnosing the issue?

What are some possible hypotheses for what could have caused the drop?

Before we get too far into the details, let’s set the stage for context. Suppose you do SEO for an enterprise software company that has a suite of products for developers and content managers. Your company services over 130,000 customers worldwide. There are teams similar to yours all over the world in your primary-market countries, e.g., Canada, Germany, and Australia.

The US market represents the largest of your target markets, and the .com site is ideally the one you want ranking in search engines. The website is on the same level, size-wise, as the IBMs, Microsofts, and Salesforces of the SaaS world.

You’re an experienced SEO, and after some digging, you realize that several teams in different countries have published content that is stealing traffic away from the core US product page, and this practice could also impact the rankings and performance of other products in the future.

What do you need to do to course correct with cross functional teams?

How do you go about educating teams on how to avoid this kind of issue in the future?

There is a lot to think about here. And you’ve only just finished your first cup of coffee.

Where to start looking  

First things first: diagnosing a drop in traffic means looking at a handful of data sources and formulating a hypothesis. Here are 6 areas where SEOs should begin looking for clues, what to look for, and why it’s relevant to SEO:

  1. Google Analytics
    • Look for: Which type of traffic declined: organic, direct, or referral? Did other, similar pages on the site lose rankings within this same timeframe, or is this an isolated incident?
    • SEO relevance: Knowing which type of traffic source has declined means you’ll be able to backtrack to the source of the issue. A decline in Referral traffic, for example, may mean some links have broken on referring websites. A decline in Organic traffic is harder to diagnose, but it largely means reduced visibility of your URLs in the SERP.
  2. Google Search Console 
    • Look for: Changes to the page in the Performance section. Look at data on Pages, Position, and Search Appearance. Are there any new warnings for this page that Google is flagging for you?
    • SEO relevance: This is largely where webmasters can “communicate” with Google about their website performance, so making sure primary elements like the XML sitemap are still visible and up to date is important to double-check.
  3. Development teams & Robots.txt
    • Look for: Ask internal dev teams when the page was last updated or scheduled to be updated (whether that’s tracking pixels, HTML code, or on-page content). Which team was the last to make updates to the page? Additionally, speak with the dev team leads to confirm nothing changed with the robots.txt.
    • SEO relevance: Webmasters use the robots.txt file to communicate crawl instructions to web robots; bots are either allowed or disallowed from crawling the various folders of the site (see the sketch after this list). If the page were accidentally disallowed from being crawled, that would impact its rankings.
  4. Enterprise SEO software tools (Searchmetrics, Conductor, BrightEdge)
    • Look for: Indications that other URLs have begun ranking for the term(s) your page previously ranked for. Areas like:
    • Winner/Loser Keywords: what terms were ranking on the page before and after it lost rankings? It could be that a new, better page is in the index that Google is favoring.
    • Overall Content Relevance (E-A-T):  Some tools can measure how relevant the content is against multiple, similar pages in Google’s index. If there were any on-page content changes made recently, it’s worth investigating since it’s possible the changes were ultimately not helpful to users and the page is now underperforming. 
    • URL Rankings: Are there similar pages on the website (even other versions by country) that are cannibalizing rankings of this page? This happens frequently with large, enterprise sites. 
    • Crawl data & log files: Run a crawl to get an idea of what Google is seeing (or not seeing) on the page. Inspect log files for additional insights if available.  Also, check the backlink profile of the page. It’s possible the URL is suffering from spam and/or malicious backlinks dragging down page authority. 
  5. Competitor site performance
    • Look for: Have similar pages on your competitor sites lost rankings too? It could mean aspects of an algorithm update are affecting your industry vertical or that Google decided the page was no longer relevant for certain search queries. 
    • SEO relevance: The search intent of users changes over time. That’s why it’s important to update content to be relevant to the nature of what people are searching for and to continue improving upon existing content.  
  6. Search Engine Results Page (SERP)
    • Look for: Do a manual check to see if there are new features in the mobile and/or desktop SERP (whichever device you care about getting traffic from). New elements like a carousel or more images can push organic listings further down the first page.
    • SEO relevance: Simply put, Google is always looking to surface results based on what users want; “Google Images accounts for more than 20% of all queries Americans performed in 2018, and that’s down from a high of nearly 30% three years ago.” (Research from SparkToro on 2018 search market share.)
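As promised in #3 above, here’s a minimal, hypothetical robots.txt sketch (the folder names are made up for illustration). A single misplaced rule can block the very pages you want ranked:

  User-agent: *
  Disallow: /checkout/
  Disallow: /internal-search/
  # If a rule like "Disallow: /products/" slipped in by mistake,
  # every product page would be blocked from crawling.

  Sitemap: https://www.example.com/sitemap.xml

Whenever a page mysteriously stops ranking, checking this file’s change history (or asking the dev team about recent edits) is a quick, cheap first step.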

Hypothesis – Why the drop happened 

What are some possible hypotheses for the drop in traffic?

On the whole, there are basically two types of sources affecting rankings: internal and external.

Some of the most common drops in organic traffic are the result of external changes, such as an algorithm update or the SERP landscape changing (e.g., images are prioritized). Competitor pages can steal rankings if they begin occupying better (higher) positions on the first page. Lastly, a page or site can drop if Google suspects questionable ways of gaining rankings (black- or gray-hat SEO tactics) are being used; this results in a manual action (penalty).

Conversely, rankings can be affected by internal factors like manual changes made to the page by different product, development or content teams. Internal page cannibalization can occur where other, existing web pages within the same domain outperform the page you want to be ranking. 

Both internal and external factors should be considered and evaluated in order to identify the source and best solution.

It’s common for enterprise-level organizations to have many teams interacting with the website at any given time, which is why it’s important for SEO leads to have open lines of communication and relationships with all teams to quickly address any imbalances.

How to prevent future drops in traffic

In this scenario, the drop was caused by internal teams, where different country pages were cannibalizing rankings of the US .com page.

How can an SEO lead go about educating cross-functional, global teams to avoid this issue in the future?

One approach is for those in SEO leadership roles to work with internal team leads on creating a shared resource document that is referenced (almost like a checklist) by the various product, content, design, etc. teams when updating and publishing important core pages. The reference sheet should include any important on-page elements that contribute to rankings and list technical SEO requirements that need to be adhered to. Things like:

  • KWs & relevant content: List the primary and secondary keywords/topics for core pages which contribute to rankings. Identify region-specific content or questions that should be addressed so that the page is relevant to local searches and therefore not competitive with other URLs.
  • Technical SEO: In this case, hreflang tags should be applied to each regional page to distinguish the content meant for each region (see the example below).
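To illustrate the hreflang point, here’s a minimal, hypothetical sketch (the example.com domains and paths are made up). Each regional page’s <head> would carry the full set of alternates, including a reference to itself:

  <link rel="alternate" hreflang="en-us" href="https://www.example.com/software/" />
  <link rel="alternate" hreflang="en-ca" href="https://www.example.com/ca/software/" />
  <link rel="alternate" hreflang="en-au" href="https://www.example.com/au/software/" />
  <link rel="alternate" hreflang="de-de" href="https://www.example.de/software/" />
  <link rel="alternate" hreflang="x-default" href="https://www.example.com/software/" />

The tags must be reciprocal: every version listed has to link back to all of the others, or Google may ignore the annotations entirely.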

It’s an ongoing combination of maintaining SEO guideline documents in a shared location, using dashboards to monitor ranking fluctuations, and educating the broader internal organization on SEO best practices to help them become more aware of the things that negatively impact rankings. 

Now, it’s back to scanning the horizon for Pandas and Penguins. 

Over to you, fellow SEOs!

Have you had to navigate a similar experience in your org? What other data sets have you used that helped you diagnose and resolve a traffic loss issue? Or, what did I miss in my list that can be added here?  Let me know by commenting below. 

Are Enterprise SEOs a Dying Breed?

Imagine you’re a physician. You’re traveling home on a flight back from a week-long conference where you had to renew your certification. You met many new and old connections and came away knowing your industry is alive and well.  The plane loudly hums along through the air while you review your session notes. Then you begin to hear some commotion from the other passengers a few rows behind you.

One voice. “Can we get her some water?”

Another voice. “She’s having trouble breathing…”

The flight attendant call button sounds in the cabin: “ding!” You remain seated. Ears pricked up, but waiting.

Your eyes are just returning to your notes when the pilot comes over the loudspeaker: “Sorry for the disturbance, folks. If there is a doctor on board, please make yourself known to a flight attendant.”

Out of commitment to your field, you are obligated to get involved. Out of personal passion, you have chosen this field. Either way, you are required to help and try to restore that human being back to health. And because of this, people listen to you.

I often feel like I’m a doctor making as many helpful recommendations as I can when it comes to corporate SEO initiatives. But there are so many different parties involved; it can be hard to meet everyone’s needs equally – time involved, level of effort, impact on improving organic traffic, all while staying on top of industry fluctuations. For such improvements to make an impact site-wide, it takes a village.

My parents are both in the medical field. When I was young, I was actually dissuaded from becoming a doctor. But I still have this inherent desire to help and to fix things.

When I hear digital challenges like “why did organic traffic drop on this date?” or “why are these pages not converting?”, I like the investigation. I thrive on it. I look at the symptoms the website or a page is exhibiting, and I try to gauge that against what I know of Google’s standard for user experience and content that’s relevant to the intent behind the search query.

But I have to be careful not to go too deep down the rabbit hole on what factors might be the cause of the issue. Today, the algorithms are working in real time and we can never be fully confident in the knowledge that a single factor is the cause.

Which is why we, as SEOs, make recommendations to the best of our knowledge; we test, and we watch. If the patient (website) improves, we know we addressed the right aspect of the problem. This is why SEO is a long-term game. There are no shortcuts to quality. It’s an investment in the right things, making sure you empower other teams to help you along the way.

“There is a new breed of SEO manager who is politically savvy and gifted at collaborating with and mobilizing non-SEO teams. If SEO-integration isn’t on your roadmap, you’d better hope it’s not on your competitors’ maps either–otherwise they’ll have gold, and you won’t.”  The Executive SEO Playbook, by Jessica Bowman

Why do doctors never give up? Because they care. And it might also have something to do with taking a Hippocratic Oath 😉

How can enterprise-level SEOs be as effective? My prescription is the following:

  1. Have more productive SEO-based conversations with stakeholders.
  2. Make SEO easy to implement and actionable for each team.
  3. Foster connections with other trusted, in-house SEOs and seek their advice regularly.
  4. Read Jessica’s book!