AI: The Shiny New Object in Search

It’s apparently a new day in Search. Why? AI (Artificial Intelligence) is officially integrated into two notable search engines, Google and (the new) Bing.

I finally had a chance to digest some of the headlines and read a few industry articles on the latest shiny new object in SEO: AI in Search. Here’s an overview of what I find interesting and the trends to watch.

Interestingly, in Google’s attempt to beat Bing to the punch in announcing its AI product, the product came off as rushed to market, and the presentation itself wasn’t as well coordinated as it should have been, resulting in the company’s stock tanking.

Alphabet lost about $100 billion in value after a demo meant to show off the AI-powered chatbot bungled its response about the James Webb Space Telescope.

https://www.cnet.com

That is not a headline any investor wants to wake up to.

For context, Google announced on February 6th the introduction of Bard to Search. At first, based on the location of AI in their SERP, it appeared to be a revamped version of a Featured Snippet, an organic feature that sits at the top of the results page.

What is the purpose of artificial intelligence?

Google’s product announcement made me wonder what queries they were trying to target. What problem were they trying to solve? Reading through their announcement, though, I think this key concept completely flew under the radar:

“AI can be helpful in these moments, synthesizing insights for questions where there’s no one right answer. Soon, you’ll see AI-powered features in Search that distill complex information and multiple perspectives into easy-to-digest formats, so you can quickly understand the big picture and learn more from the web.”

Google AI Search announcement

Still, it wasn’t until I read Brody Clarke’s article, where he talked about “NORA (No One Right Answer),” that I realized incorporating AI into search is an attempt to solve for informational searches with no one right answer.

What is artificial intelligence in simple words? It’s an Ask-Me-Anything machine.

Sample AI question prompts from the New Bing homepage.

A tale of two products – Google Bard & Bing AI Chat

Google Bard: Citations – Out, damned spot; out, I say! —One, two: why then, ’tis time to do’t.

Bard…Shakespeare. Get it?

The key drawback of how AI is being presented, at least initially in Google’s interface, is that the AI responses do not provide clear links or citations to web publishers. Glenn Gabe’s article calls this out as “Google’s war against publishers,” and he’s not wrong. But I think the behavior we’ve seen with Featured Snippets tells a similar story, because a similar argument came up with the emergence of zero-click search results: “if I can see the answer in the SERP, why do I need to click on the result?” Everyone was worried about their organic traffic and CTR.

Over time, earning the Featured Snippet spot did result in some organic traffic as those consumers who desired to learn more clicked through. Gabe added, “It’s also worth noting that Google has not answered any questions about citing sources. And I mean literally nothing has come from anyone at Google about linking to publishers (which makes me think they were unprepared for the question). That’s also scary…so we’ll see where this ends up.”

That should have been a core tenet of the product. It’s very strange that no one at Google commented on citations.

On a separate but related note, along with the announcement of AI-driven content has come speculation that jobs will be taken away because AI will create all of this incredible content at scale. But that’s a fallacy.

AI responds to prompts from humans and can only learn from what humans teach it or feed it. It can’t think for itself. And it’s prone to inaccuracies and biases. AI-generated content needs, and will continue to need, human oversight.

Lily Ray makes a solid point in her article about the lag in the information used to train AI content-generation tools:

“For example, ChatGPT was trained on data ending in late 2021, although the tool does appear to be improving to reflect more recent information. Given the importance of fresh information in so many areas of SEO, this is a significant limitation for the tools to be able to produce entirely helpful content.” 

In relation to the SEO community, the guidance Google emphasized about content creation is that it comes down to the quality of the product. AI can be involved in generating content (there’s no penalty for this), but if the content isn’t helpful to users, it will get weeded out by the systems & algorithms in place (e.g., the Helpful Content Update).

Marie Haynes’s newsletter highlights this as well: “It is perfectly fine to use AI in your content creation efforts as long as your end product demonstrates E-E-A-T and is helpful to people. However, Google does not recommend you list AI as the author of your content.”

The extra “E” is for Experience.

Bing AI Chat contains citations (peasants rejoice!)

The next day, Microsoft held a huge event in Seattle where they showcased their AI Search product. Their implementation was well thought out and wowed the media. Bing’s AI chat attributed the information to the sites where it “learned” the answers it was providing.

This is a step in the right direction in terms of product attributes but, without the volume of users Google can drive, it’s small potatoes to publishers.

What’s interesting about Bing’s POV is that it’s time for a new approach. Barry Schwartz attended the announcement and posted about it on Twitter, noting:

  • Bing will launch an all-new, AI-powered search engine. It’s better, will answer questions, you can chat with it, and it can create content for you.
  • 40% of all queries result in someone clicking back, and most searches are three keywords or less…
  • He (Yusuf Mehdi) explains navigational, informational and other searches people do today. 50% of people’s searches go unanswered, which is why it is time for a new approach to search.

Product War: To cite or not to cite

Whether or not annotations are visible in the SERP, IMHO, is irrelevant. The important piece for publishers is being rewarded with organic traffic when users actually click through from the AI result to the publisher’s site. Regardless of whether it’s a Google or Bing search, if the No One Right Answer section from AI doesn’t link directly to the original source(s), that’s when you’ll really have an uprising on your hands.

At that point, there’s zero incentive for publishers to invest time and resources in creating content if organic search stops delivering as a referral source in their analytics.

Speaking of citing sources, what if there’s a scenario where you don’t want AI chatbots learning from your content and subsequently misquoting it or using it out of context? I thought that was an interesting consideration after seeing this article in the SEO FOMO newsletter by Aleyda Solis: How to Block ChatGPT from learning on your website, by Roger Montti.

The technique involves the robots.txt file, but it’s not a guarantee; a rough sketch of the approach follows below. At the very least, it’s nice to have an idea of how to attempt this if such a scenario presents itself.
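
Here’s a minimal sketch of the idea. The assumption (mine, not necessarily Montti’s) is that the crawler you want to keep out is Common Crawl’s CCBot, a known source of LLM training data; since compliance with robots.txt is voluntary, this is a request, not an enforcement mechanism:

    # robots.txt
    # Ask Common Crawl's crawler (CCBot), a known source of AI training data,
    # to stay out of the entire site. Well-behaved bots honor this; nothing enforces it.
    User-agent: CCBot
    Disallow: /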

Aleyda always does a great job curating her newsletter content. I highly recommend signing up.

AI trends to watch

  1. Product dominance will win.
    Will Bing’s AI product tip the scales in their favor, changing consumer behavior so that more users start with Bing vs. Google? IMO, I don’t think this is likely for 10-20 years because, as Gabe pointed out, “Google typically drives exponentially more traffic than Bing” (his post offered a handful of comparisons showing the difference between Google Search Console and Bing Webmaster Tools Clicks & Impressions data). The adoption gap is still too great.
  2. Being second best in the market means more room to test & iterate.
    Microsoft has multiple sources of revenue, and Search is a relatively small one at this point. That gives them an advantage: they can afford to “test and learn” their way from the product they launched this month to a more refined, mature product. Google, on the other hand, has to worry about the impact to Search; bigger changes are riskier because they have the lion’s share of the market.
  3. There will be short term & long term effects of AI – and they’ll be very different.
    Gabe’s idea that AI is headed towards manifesting “Jarvis for Search” (as in the AI assistant created by Tony Stark) got me thinking. I think it would be incredible because Jarvis is Stark’s right hand and intuitively knows what, when and how he needs answers or action. But I’d only be on board if it operated in a closed network; can you imagine your Jarvis inundating you with advertising or spam?

Final thoughts

IMHO, it’s one thing for Bing to say it wants to provide links to publishers in their AI chat.

But the bigger question is “will more people use Bing as opposed to Google?”

User behavior is a completely different animal; influencing it…changing it…creating new habits…that takes a long time. In 2023, Bing is not the current market leader when it comes to search engines. Google is. And by a very large margin.

Worldwide desktop market share of leading search engines from January 2015 to December 2022. Source: Statista 2023

Hypothetically, even if Bing has a superior search product with the addition of AI, it’s still all about adoption and preference and doing what’s easiest.

The consumer POV is: “What app is on my phone that I feel gives me the fastest, most reliable, accurate results?” That wins every time.

As far as No One Right Answer results go, take them with a grain of salt. These AI products were basically rushed to market, and there’s a long way to go toward improving their outputs.

For the record, none of this was written using AI. I used my brain, my experience as an SEO professional and a keyboard to write out my thoughts.

The information contained in this post does not reflect the views of my employer.

Diagnosing A Drop In Traffic: 6 Data Sources to Check & Why

It’s a common scenario for an SEO Manager. You come into the office one morning, open up your SEO dashboards, and notice a large drop in traffic to a core product page for your main software product. It seems to have happened almost overnight.

How do you go about diagnosing the issue?

What are some possible hypotheses for what could have caused the drop?

Before we get too far into the details, let’s set the stage for context. Let’s suppose you do SEO for an enterprise software company that has a suite of products for developers and content managers. Your company services over 130,000 customers worldwide. There are teams similar to yours all over the world in your primary markets, e.g., Canada, Germany and Australia.

The US market represents the largest of your target markets, and the .com site is ideally the one you want ranking in search engines. The size of the website is on the same level as the IBMs, Microsofts and Salesforces of the SaaS world.

You’re an experienced SEO, and after some digging, you realize that several teams in different countries have published content that is stealing traffic away from the core US product page, and this practice could also impact the rankings and performance of other products in the future.

What do you need to do to course correct with cross functional teams?

How do you go about educating teams on how to avoid this kind of issue in the future?

There is a lot to think about here. And you’ve only just finished your first cup of coffee.

Where to start looking  

First things first, diagnosing a drop in traffic means looking at a handful of data sources and formulating a hypothesis. Here are 6 areas where SEOs should begin looking for clues, what to look for, and why it’s relevant to SEO:

  1. Google Analytics
    • Look for: Which type of traffic declined: organic, direct or referral? Did other, similar pages on the site lose rankings within this same timeframe, or is this an isolated incident?
    • SEO relevance: Knowing which type of traffic source has declined means you’ll be able to backtrack to the source of the issue. A decline in Referral traffic, for example, may mean some links on referring websites have broken. A decline in Organic traffic is harder to diagnose, but it largely means your URLs have lost visibility in the SERP. (A sketch of pulling this data programmatically follows this list.)
  2. Google Search Console 
    • Look for: Changes to the page in the Performance section. Look at data on Pages, Position, and Search Appearance. Are there any new warnings for this page that Google is flagging for you?
    • SEO relevance: This is largely where webmasters can “communicate” with Google about their website’s performance, so double-checking that primary elements like your XML sitemap are still submitted and up to date is important. (A Search Console API sketch also follows this list.)
  3. Development teams & Robots.txt
    • Look for: Ask internal dev teams when the page was last updated or is scheduled to be updated (whether that’s tracking pixels, HTML code or on-page content). Which team last made updates to the page? Additionally, speak with the dev team leads to confirm nothing changed in the robots.txt.
    • SEO relevance: Webmasters use the robots.txt file to communicate crawl instructions to web robots. Bots are either allowed or disallowed from crawling the various folders of the site. If the page were accidentally disallowed from being crawled, that would impact its rankings.
  4. Enterprise SEO software tools (Searchmetrics, Conductor, BrightEdge)
    • Look for: Indications that other URLs have begun ranking for the term(s) your page previously ranked for. Areas like:
    • Winner/Loser Keywords: what terms were ranking on the page before and after it lost rankings? It could be that a new, better page is in the index that Google is favoring.
    • Overall Content Relevance (E-A-T):  Some tools can measure how relevant the content is against multiple, similar pages in Google’s index. If there were any on-page content changes made recently, it’s worth investigating since it’s possible the changes were ultimately not helpful to users and the page is now underperforming. 
    • URL Rankings: Are there similar pages on the website (even other versions by country) that are cannibalizing rankings of this page? This happens frequently with large, enterprise sites. 
    • Crawl data & log files: Run a crawl to get an idea of what Google is seeing (or not seeing) on the page. Inspect log files for additional insights if available (a rough log-parsing sketch follows this list). Also, check the backlink profile of the page. It’s possible the URL is suffering from spam and/or malicious backlinks dragging down page authority.
  5. Competitor site performance
    • Look for: Have similar pages on your competitor sites lost rankings too? It could mean aspects of an algorithm update are affecting your industry vertical or that Google decided the page was no longer relevant for certain search queries. 
    • SEO relevance: The search intent of users changes over time. That’s why it’s important to update content to be relevant to the nature of what people are searching for and to continue improving upon existing content.  
  6. Search Engine Results Page (SERP)
    • Look for: Do a manual check to see if there are new features in the mobile and/or desktop SERP (whichever device you care about getting traffic from). New elements like a carousel or additional images can push organic listings further down the first page.
    • SEO relevance: Simply put, Google is always looking to surface results that are based on what users want; “Google Images accounts for more than 20% of all queries Americans performed in 2018, and that’s down from a high of nearly 30% three years ago.” Research from SparkToro on 2018 search market share.
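
To make the Google Analytics check (item 1) concrete, here’s a minimal sketch using the GA4 Data API via the google-analytics-data Python package. The property ID and landing-page path are placeholders I made up for the scenario, and it assumes credentials are configured via GOOGLE_APPLICATION_CREDENTIALS:

    # A rough sketch: pull sessions by channel for one landing page,
    # before vs. after the drop. Property ID and path are placeholders.
    from google.analytics.data_v1beta import BetaAnalyticsDataClient
    from google.analytics.data_v1beta.types import (
        DateRange, Dimension, Filter, FilterExpression, Metric, RunReportRequest,
    )

    client = BetaAnalyticsDataClient()  # auth via GOOGLE_APPLICATION_CREDENTIALS

    request = RunReportRequest(
        property="properties/123456789",
        dimensions=[
            Dimension(name="sessionDefaultChannelGroup"),
            Dimension(name="landingPage"),  # must be requested to be filterable
        ],
        metrics=[Metric(name="sessions")],
        date_ranges=[
            DateRange(start_date="2023-01-01", end_date="2023-01-14"),  # before the drop
            DateRange(start_date="2023-01-15", end_date="2023-01-28"),  # after the drop
        ],
        dimension_filter=FilterExpression(
            filter=Filter(
                field_name="landingPage",
                string_filter=Filter.StringFilter(value="/products/core-product/"),
            )
        ),
    )

    # With two date ranges, each row also carries a dateRange dimension value,
    # so you can compare organic vs. direct vs. referral across the two periods.
    for row in client.run_report(request).rows:
        print([d.value for d in row.dimension_values],
              [m.value for m in row.metric_values])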
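
The Search Console check (item 2) can be scripted the same way. This sketch uses the Search Analytics endpoint of the Search Console API via google-api-python-client; the credentials file, site URL, and page URL are placeholders, and it assumes a service account that has been granted access to the property:

    # A rough sketch: daily clicks/impressions/position for the affected page.
    # Credentials path, site URL, and page URL are placeholders.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
    creds = service_account.Credentials.from_service_account_file(
        "service-account.json", scopes=SCOPES
    )
    service = build("searchconsole", "v1", credentials=creds)

    response = service.searchanalytics().query(
        siteUrl="https://www.example.com/",
        body={
            "startDate": "2023-01-01",
            "endDate": "2023-01-28",
            "dimensions": ["date"],
            "dimensionFilterGroups": [{
                "filters": [{
                    "dimension": "page",
                    "operator": "equals",
                    "expression": "https://www.example.com/products/core-product/",
                }]
            }],
        },
    ).execute()

    # A sudden fall in impressions points to lost SERP visibility; flat
    # impressions with falling clicks points to a SERP-layout or CTR issue.
    for row in response.get("rows", []):
        print(row["keys"][0], row["clicks"], row["impressions"], round(row["position"], 2))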
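
And for the log files mentioned in item 4, here’s a rough sketch of tallying hits per URL from user agents claiming to be Googlebot in a standard combined-format access log. The log path is a placeholder, and verifying that a hit really came from Googlebot requires a reverse DNS lookup, which is omitted here:

    # A rough sketch: count hits per URL for user agents claiming to be
    # Googlebot in a combined-format access log. "access.log" is a placeholder.
    import re
    from collections import Counter

    # combined format: ip - - [time] "METHOD /path HTTP/x" status size "referer" "user-agent"
    line_re = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[^"]*" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"')

    hits = Counter()
    with open("access.log", encoding="utf-8", errors="replace") as f:
        for line in f:
            m = line_re.search(line)
            if m and "Googlebot" in m.group("ua"):
                hits[m.group("path")] += 1

    # A page Googlebot has stopped requesting is a strong crawl/robots.txt clue.
    for path, count in hits.most_common(20):
        print(count, path)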

Hypothesis – Why the drop happened 

What are some possible hypotheses for the drop in traffic?

On the whole, there are basically two categories of causes for ranking changes: internal and external.

Some of the most common drops in organic traffic are the result of external changes, such as an algorithm update or the SERP landscape changing (e.g., images are prioritized). Competitor pages can steal rankings if they begin occupying better (higher) positions on the first page. Lastly, a page or site can drop if Google suspects questionable ways of gaining rankings (black- or gray-hat SEO tactics) are being used (this results in a manual action, commonly called a manual penalty).

Conversely, rankings can be affected by internal factors like manual changes made to the page by different product, development or content teams. Internal page cannibalization can also occur, where other existing web pages within the same domain outperform the page you want to rank.

Both internal and external factors should be considered and evaluated in order to identify the source and best solution.

It’s common for enterprise-level organizations to have many teams interacting with the website at any given time, which is why it’s important for SEO leads to have open lines of communication and relationships with all teams to quickly address any imbalances.

How to prevent future drops in traffic

In this scenario, the drop was caused by internal teams where different country pages were cannibalizing rankings of the US .com page.  

How can an SEO lead go about educating cross-functional, global teams in the future to avoid this issue?

One approach is for those in SEO leadership roles to work with internal team leads on creating a shared resource document that is referenced (almost like a checklist) by the various product, content, design, etc. teams when updating and publishing important core pages. The reference sheet should include any important on-page elements that contribute to rankings and list the technical SEO requirements that need to be adhered to. Things like:

  • Keywords & relevant content: list the primary and secondary keywords/topics for core pages that contribute to rankings. Identify region-specific content or questions that should be addressed so that the page is relevant to local searches and therefore not competing with other URLs.
  • Technical SEO: in this case, hreflang tags should be applied to each regional page to distinguish the content meant for each region (see the sketch below).
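
To make the hreflang item concrete, here’s a minimal sketch using hypothetical URLs for the scenario’s US, Canadian, Australian and German pages. Each regional version of the page lists itself and all of its alternates, and the annotations must be reciprocal across every version for Google to honor them:

    <!-- In the <head> of every regional version of the page; URLs are hypothetical -->
    <link rel="alternate" hreflang="en-us" href="https://www.example.com/products/core-product/" />
    <link rel="alternate" hreflang="en-ca" href="https://www.example.com/en-ca/products/core-product/" />
    <link rel="alternate" hreflang="en-au" href="https://www.example.com/en-au/products/core-product/" />
    <link rel="alternate" hreflang="de-de" href="https://www.example.com/de-de/products/core-product/" />
    <!-- x-default tells Google which version to show when no region matches -->
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/products/core-product/" />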

It’s an ongoing combination of maintaining SEO guideline documents in a shared location, using dashboards to monitor ranking fluctuations, and educating the broader internal organization on SEO best practices to help them become more aware of the things that negatively impact rankings. 

Now, it’s back to scanning the horizon for Pandas and Penguins. 

Over to you, fellow SEOs!

Have you had to navigate a similar experience in your org? What other data sets have you used that helped you diagnose and resolve a traffic loss issue? Or, what did I miss in my list that can be added here?  Let me know by commenting below. 

Are Enterprise SEOs a Dying Breed?

Imagine you’re a physician. You’re traveling home on a flight back from a week-long conference where you had to renew your certification. You met many new and old connections and came away knowing your industry is alive and well.  The plane loudly hums along through the air while you review your session notes. Then you begin to hear some commotion from the other passengers a few rows behind you.

One voice. “Can we get her some water?”

Another voice. “She’s having trouble breathing…”

The flight attendant call button sounds in the cabin: “ding!” You remain seated, ears pricked up but waiting.

Your eyes are just returning to your notes when the pilot comes over the loudspeaker: “Sorry for the disturbance, folks. If there is a doctor on board, please make yourself known to a flight attendant.”

Out of commitment to your field, you are obligated to get involved. Out of personal passion, you have chosen this field. Either way, you are required to help and try to restore that human being back to health. And because of this, people listen to you.

I often feel like I’m a doctor making as many helpful recommendations as I can when it comes to corporate SEO initiatives. But there are so many different parties involved; it can be hard to meet everyone’s needs equally – time involved, level of effort, impact on improving organic traffic, all while staying on top of industry fluctuations. For such improvements to make an impact site-wide, it takes a village.

My parents are both in the medical field. When I was young, I was actually dissuaded from becoming a doctor. But I still have this inherent desire to help and to fix things.

When I hear digital challenges like “why did organic traffic drop on this date” or “why are these pages not converting,” I like the investigation. I thrive on it. I look at the symptoms the website or a page is exhibiting, and I try to gauge that against what I know of Google’s standard for user experience and content that’s relevant to the intent behind the search query.

But I have to be careful not to go too deep down the rabbit hole on what factors might be the cause of the issue. Today, the algorithms are working in real time and we can never be fully confident in the knowledge that a single factor is the cause.

Which is why we as SEOs make recommendations to the best of our knowledge, we test, and we watch. If the patient (website) improves, we know we addressed the right aspect of the problem. This is why SEO is a long-term game. There are no shortcuts to quality. It’s an investment in the right things, making sure you empower other teams to help you along the way.

“There is a new breed of SEO manager who is politically savvy and gifted at collaborating with and mobilizing non-SEO teams. If SEO-integration isn’t on your roadmap, you’d better hope it’s not on your competitors’ maps either–otherwise they’ll have gold, and you won’t.”  The Executive SEO Playbook, by Jessica Bowman

Why do doctors never give up? Because they care. And it might also have something to do with taking a Hippocratic Oath 😉

How can enterprise-level SEOs be as effective? My prescription is the following:

  1. Have more productive SEO-based conversations with stakeholders.
  2. Make SEO easy to implement and actionable for each team.
  3. Foster connections with other trusted, in-house SEOs and seek their advice regularly.
  4. Read Jessica’s book!