June 28, 2018

What Do SEOs Do When Google Removes Organic Search Traffic? – Whiteboard Friday

Posted by randfish

We rely pretty heavily on Google, but some of their decisions of late have made doing SEO more difficult than it used to be. Which organic opportunities have been taken away, and what are some potential solutions? Rand covers a rather unsettling trend for SEO in this week's Whiteboard Friday.

What Do SEOs Do When Google Removes Organic Search?


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're talking about something kind of unnerving. What do we, as SEOs, do as Google is removing organic search traffic?

So for the last 19 or 20 years that Google has been around, every month Google has had, at least seasonally adjusted, not just more searches, but they've sent more organic traffic than they did that month the year before. So this has been on a steady incline. There's always been more opportunity in Google search until recently, and that is because of a bunch of moves, not that Google is losing market share, not that they're receiving fewer searches, but that they are doing things that make SEO a lot harder.

Some scary news

Things like...

  • Aggressive "answer" boxes. So you search for a question, and Google provides not just a featured snippet, which can earn you a click-through, but a box that truly answers the searcher's question, coming directly from Google themselves, or a set of card-style results that lists all the things the person might be looking for.
  • Google is moving more and more aggressively into commercial spaces, like jobs, flights, products, all of these kinds of searches where previously there was opportunity and now there's a lot less. If you're Expedia or Travelocity or Hotels.com or Cheapflights and you see what's going on with flight and hotel searches in particular, Google is essentially saying, "No, no, no. Don't worry about clicking anything else. We've got the answers for you right here."
  • We also saw for the first time a seasonally adjusted drop, a drop in total organic clicks sent. That was between August and November of 2017. We know this thanks to the Jumpshot dataset. It happened at least here in the United States. We don't know if it's happened in other countries as well. But that's certainly concerning, because that is not something we've observed in the past. There were fewer clicks sent than there were previously. That makes us pretty concerned. It didn't go down very much. It went down a couple of percentage points. There are still a lot more clicks being sent in 2018 than there were in 2013. So it's not like we've dipped below something, but it is concerning.
  • New zero-result SERPs. We absolutely saw those for the first time. Google rolled them back after rolling them out. But, for example, when you searched for the time in London or a Lagavulin 16, Google showed no results at all, just a little box with the time and then potentially some AdWords ads. So zero organic results, nothing for an SEO to even optimize for in there.
  • Local SERPs that remove almost all need for a website. Local SERPs have been getting more and more aggressively tuned so that you never need to click through to the website, and, in fact, Google has made it harder and harder to find the website in both mobile and desktop versions of local searches. So if you search for a Thai restaurant and try to find the website of the one you're interested in, as opposed to just information about it in Google's local pack, that's frustratingly difficult. Google is making those local packs more and more aggressive and pushing them further forward in the results.

Potential solutions for marketers

So, as a result, I think search marketers really need to start thinking about: What do we do as Google takes away this opportunity? How can we continue to compete and provide value for our clients and our companies? I think there are three big paths — I won't get into the details of each — that we can pursue.

1. Invest in demand generation for your brand + branded product names to leapfrog declines in unbranded search.

The first one is pretty powerful and pretty awesome, which is investing in demand generation rather than just demand serving: demand generation for your brand and branded product names. Why does this work? Well, let's say, for example, I'm searching for SEO tools. What do I get? I get back a list of results from Google with a bunch of mostly articles saying these are the top SEO tools. In fact, Google has now made a little one-box, card-style list result up at the top, a carousel that shows different brands of SEO tools. I don't think Moz is actually listed in there, because I think they're pulling from the second or third list instead of the first one. Whatever the case, it's frustrating, hard to optimize for, and Google could take away the demand or the click-through rate opportunity from it.

But if someone performs a search for Moz, well, guess what? I mean we can nail that sucker. We can definitely rank for that. Google is not going to take away our ability to rank for our own brand name. In fact, Google knows that, in the navigational search sense, they need to provide the website that the person is looking for front and center. So if we can create more demand for Moz than there is for SEO tools (and according to Google Trends, there's already something like 5 or 10 times more demand for Moz than for SEO tools), that's a great way to go. You can do the same thing through your content, through your social media, and through your email marketing. Even through search you can create demand for your brand rather than for unbranded terms.

2. Optimize for additional platforms.

Second thing, optimizing across additional platforms. So we've looked, and YouTube and Google Images account for about half of the overall volume that goes to Google web search. So between these two platforms, you've got a significant amount of additional traffic that you can optimize for. Images has actually gotten less aggressive recently. They've taken away the "view image directly" link, so more people are visiting websites via Google Images. YouTube, obviously, is a great place to build brand affinity, to build awareness, to create demand, this kind of demand generation to get your content in front of people. So these two are great platforms for that.

There are also significant amounts of web traffic still on the social web — LinkedIn, Facebook, Twitter, Pinterest, Instagram, etc., etc. The list goes on. Those are places where you can optimize, put your content forward, and earn traffic back to your websites.

3. Optimize the content that Google does show.

Local

So if you're in the local space and you're saying, "Gosh, Google has really taken away the ability for my website to get the clicks that it used to get from Google local searches," go into Google My Business and optimize the information you provide so that people who perform that query will be satisfied by Google's result. Yes, they won't get to your website, but they will still come to your business, because you've optimized the content Google is showing through Google My Business so that those searchers want to engage with you. I think this sometimes gets lost in the SEO battle. We're trying so hard to earn the click to our site that we're forgetting that a lot of search experience ends right at the SERP itself, and we can optimize there too.

Results

In the zero-result sets, Google was still willing to show AdWords ads, which means if we have customer targets, we can use remarketing lists for search ads (RLSA), or we can run paid ads and still optimize for those. We could also try to claim some of the data that might show up in zero-result SERPs. We don't yet know what that will be after Google rolls it back out, but we'll find out in the future.

Answers

For answers, the answers that Google is giving, whether that's through voice or visually, those can be curated and crafted through featured snippets, through the card lists, and through the answer boxes. We have the opportunity again to influence, if not control, what Google is showing in those places, even when the search ends at the SERP.

All right, everyone, thanks for watching for this edition of Whiteboard Friday. We'll see you again next week. Take care.

Video transcription by Speechpad.com



June 27, 2018

The Minimum Viable Knowledge You Need to Work with JavaScript & SEO Today

Posted by sergeystefoglo

If your work involves SEO at some level, you’ve most likely been hearing more and more about JavaScript and the implications it has for crawling and indexing. Frankly, Googlebot struggles with it, and many websites today use modern JavaScript to load in crucial content. Because of this, we need to be equipped to discuss this topic when it comes up in order to be effective.

The goal of this post is to equip you with the minimum viable knowledge required to do so. This post won’t go into the nitty-gritty details, describe the history, or give you extreme detail on specifics. There are a lot of incredible write-ups that already do this — I suggest giving them a read if you are interested in diving deeper (I’ll link out to my favorites at the bottom).

In order to be effective consultants when it comes to the topic of JavaScript and SEO, we need to be able to answer three questions:

  1. Does the domain/page in question rely on client-side JavaScript to load/change on-page content or links?
  2. If yes, is Googlebot seeing the content that’s loaded in via JavaScript properly?
  3. If not, what is the ideal solution?

With some quick searching, I was able to find three examples of landing pages that utilize JavaScript to load in crucial content.

I’m going to be using Sitecore’s Symposium landing page through each of these talking points to illustrate how to answer the questions above.

We’ll cover the “how do I do this” aspect first, and at the end I’ll expand on a few core concepts and link to further resources.

Question 1: Does the domain in question rely on client-side JavaScript to load/change on-page content or links?

The first step to diagnosing any issues involving JavaScript is to check if the domain uses it to load in crucial content that could impact SEO (on-page content or links). Ideally this will happen anytime you get a new client (during the initial technical audit), or whenever your client redesigns/launches new features of the site.

How do we go about doing this?

Ask the client

Ask, and you shall receive! Seriously though, one of the quickest/easiest things you can do as a consultant is contact your POC (or the developers on the account) and ask them. After all, these are the people who work on the website day in and day out!

“Hi [client], we’re currently doing a technical sweep on the site. One thing we check is if any crucial content (links, on-page content) gets loaded in via JavaScript. We will do some manual testing, but an easy way to confirm this is to ask! Could you (or the team) answer the following, please?

1. Are we using client-side JavaScript to load in important content?
2. If yes, can we get a bulleted list of where/what content is loaded in via JavaScript?”

Check manually

Even on a large e-commerce website with millions of pages, there are usually only a handful of important page templates. In my experience, it should only take an hour max to check manually. I use the Web Developer plugin for Chrome, disable JavaScript from there, and manually check the important templates of the site (homepage, category page, product page, blog post, etc.).

In the example above, once we turn off JavaScript and reload the page, we can see that we are looking at a blank page.

As you make progress, jot down notes about content that isn’t being loaded in, is being loaded in wrong, or any internal linking that isn’t working properly.

At the end of this step, we should know if the domain in question relies on JavaScript to load/change on-page content or links. If the answer is yes, we should also know where this happens (homepage, category pages, specific modules, etc.).

Crawl

You could also crawl the site (with a tool like Screaming Frog or Sitebulb) with JavaScript rendering turned off, then run the same crawl with JavaScript rendering turned on, and compare the differences in internal links and on-page elements.

For example, it could be that when you crawl the site with JavaScript rendering turned off, the title tags don’t appear. In my mind this would trigger an action to crawl the site with JavaScript rendering turned on to see if the title tags do appear (as well as checking manually).
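
To make that comparison concrete, here is a minimal script-form sketch of the same idea, assuming Node.js 18+ (for the built-in fetch) with the puppeteer package installed; the URL is a placeholder for whichever template you want to test. It fetches the raw HTML (what a non-rendering crawler sees), renders the page in a headless browser (what a JavaScript-capable crawler sees), and spot-checks a few crucial elements in each.

import puppeteer from "puppeteer";

const url = "https://example.com/landing-page"; // placeholder: the template to test

async function compareRawAndRendered(): Promise<void> {
  // Raw HTML: what a crawler sees with JavaScript rendering turned off.
  const raw = await (await fetch(url)).text();

  // Rendered HTML: what a JavaScript-capable crawler sees after execution.
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: "networkidle0" });
  const rendered = await page.content();
  await browser.close();

  // Spot-check a few crucial on-page elements in each version.
  for (const marker of ['<title>', '<h1', 'rel="canonical"']) {
    console.log(marker, "in raw:", raw.includes(marker), "| in rendered:", rendered.includes(marker));
  }
}

compareRawAndRendered().catch(console.error);

If a marker only shows up in the rendered version, that template relies on client-side JavaScript for it, which is the same conclusion the two-crawl comparison gives you.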

Example

For our example, I went ahead and did a manual check. As we can see from the screenshot below, when we disable JavaScript, the content does not load.

In other words, the answer to our first question for this page is “yes, JavaScript is being used to load in crucial parts of the site.”

Question 2: If yes, is Googlebot seeing the content that’s loaded in via JavaScript properly?

If your client is relying on JavaScript on certain parts of their website (in our example they are), it is our job to try and replicate how Google is actually seeing the page(s). We want to answer the question, “Is Google seeing the page/site the way we want it to?”

In order to get a more accurate depiction of what Googlebot is seeing, we need to attempt to mimic how it crawls the page.

How do we do that?

Use Google’s new mobile-friendly testing tool

At the moment, the quickest and most accurate way to try and replicate what Googlebot is seeing on a site is by using Google’s new mobile-friendly testing tool. My colleague Dom recently wrote an in-depth post comparing Search Console’s Fetch and Render, Googlebot, and the mobile-friendly tool. His findings were that most of the time, Googlebot and the mobile-friendly tool produced the same output.

In Google’s mobile-friendly tool, simply input your URL, hit “run test,” and once the test is complete, click on “source code” on the right side of the window. You can search that code for any on-page content (title tags, canonicals, etc.) or links. If they appear here, Google is most likely seeing the content.

Search for visible content in Google

It’s always good to sense-check. Another quick way to check whether Googlebot has indexed content on your page is by selecting visible text on the page and doing a site:search for it in Google with quotation marks around said text.

In our example there is visible text on the page that reads…

"Whether you are in marketing, business development, or IT, you feel a sense of urgency. Or maybe opportunity?"

When we do a site:search for this exact phrase, for this exact page, we get nothing. This means Google hasn’t indexed the content.

Crawling with a tool

Most crawling tools have the functionality to crawl JavaScript now. For example, in Screaming Frog you can head to Configuration > Spider > Rendering, then select “JavaScript” from the dropdown and hit save. DeepCrawl and Sitebulb both have this feature as well.

From here you can input your domain/URL and see the rendered page/code once your tool of choice has completed the crawl.

Example:

When attempting to answer this question, my preference is to start by inputting the domain into Google’s mobile-friendly tool, copying the source code, and searching for important on-page elements (think title tag, <h1>, body copy, etc.). It’s also helpful to use a diff checker to compare the rendered HTML with the original HTML (Screaming Frog also has a function that lets you do this side by side).
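
If you’d rather script the diff step than paste into a web tool, here is a minimal sketch using the jsdiff package (npm install diff) on Node.js; the two HTML strings are hypothetical stand-ins for the original and rendered source gathered in the previous steps.

import { diffLines } from "diff";

// Stand-ins for the original (raw) and rendered HTML gathered earlier.
const originalHtml = '<html><body><div id="root"></div></body></html>';
const renderedHtml = '<html><body><div id="root"><h1>Symposium</h1></div></body></html>';

// Print a simple diff: "+" marks content that only exists after JavaScript runs.
for (const part of diffLines(originalHtml, renderedHtml)) {
  const tag = part.added ? "+" : part.removed ? "-" : " ";
  process.stdout.write(tag + " " + part.value + "\n");
}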

For our example, here is what the output of the mobile-friendly tool shows us.

After a few searches, it becomes clear that important on-page elements are missing here.

We also did the second test and confirmed that Google hasn’t indexed the body content found on this page.

The implication at this point is that Googlebot is not seeing our content the way we want it to, which is a problem.

Let’s jump ahead and see what we can recommend to the client.

Question 3: If we’re confident Googlebot isn’t seeing our content properly, what should we recommend?

Now that we know the domain is using JavaScript to load in crucial content, and that Googlebot is most likely not seeing that content, the final step is to recommend an ideal solution to the client. Key word: recommend, not implement. It’s 100% our job to flag the issue to our client, explain why it’s important (as well as the possible implications), and highlight an ideal solution. It is 100% not our job to try to do the developer’s job of figuring out an ideal solution with their unique stack/resources/etc.

How do we do that?

You want server-side rendering

The main reason Google is having trouble seeing Sitecore’s landing page right now is that the page asks the user (us, or Googlebot) to do the heavy work of loading the JavaScript on the page. In other words, they’re using client-side JavaScript.

Googlebot is literally landing on the page, trying to execute the JavaScript as best it can, and then needing to leave before it has a chance to see any content.

The fix here is to instead have Sitecore’s landing page render on their server. In other words, we want to take the heavy lifting off of Googlebot and put it on Sitecore’s servers. This will ensure that when Googlebot comes to the page, it doesn’t have to do any heavy lifting and instead can crawl the rendered HTML.

In this scenario, Googlebot lands on the page and already sees the HTML (and all the content).
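
As a rough illustration of the concept (a sketch, not Sitecore’s actual stack), here is what server-side rendering looks like with Node.js and Express: the server assembles the complete HTML, crucial content included, before responding, so a crawler never has to execute JavaScript to see it. The page content here is hypothetical.

import express from "express";

const app = express();

// Hypothetical content; in a real app this would come from a CMS or database.
const sessions = ["Keynote", "SEO & Analytics", "Personalization"];

app.get("/", (_req, res) => {
  // The crucial content is baked into the HTML response itself, so
  // Googlebot can crawl it without running any client-side JavaScript.
  const html =
    "<!DOCTYPE html><html><head><title>Symposium</title></head><body>" +
    "<h1>Symposium</h1><ul>" +
    sessions.map((s) => "<li>" + s + "</li>").join("") +
    "</ul></body></html>";
  res.send(html);
});

app.listen(3000, () => console.log("SSR sketch listening on port 3000"));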

There are more specific options (like isomorphic setups)

This is where it gets a bit into the weeds, but there are hybrid solutions. The best one at the moment is called an isomorphic setup.

In this model, we're asking the client's server to render the first request, and then any future requests are made client-side.

So Googlebot comes to the page, the client’s server has already executed the initial JavaScript needed for the page, sends the rendered HTML down to the browser, and anything after that is done on the client-side.
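
Here is a hedged sketch of that flow using React, one common way to build an isomorphic app (assumptions: the react and react-dom packages, plus a bundler that ships the client script; none of this comes from the post itself). The same component renders to HTML on the server for the first request, and the client then hydrates that markup and takes over.

import { createElement } from "react";
import { renderToString } from "react-dom/server";

// One component shared by both the server and the client bundle.
function App() {
  return createElement("h1", null, "Rendered on the server, hydrated on the client");
}

// Server side: the first request is rendered to HTML here, so Googlebot
// receives the content without executing any JavaScript.
const html = renderToString(createElement(App));
// ...embed `html` in the page shell and send it in the HTTP response.

// Client side (in the bundled script): React attaches to the existing markup
// instead of rebuilding it, and later interaction happens client-side.
// import { hydrateRoot } from "react-dom/client";
// hydrateRoot(document.getElementById("root")!, createElement(App));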

If you’re looking to recommend this as a solution, please read this post from the Airbnb team, which covers isomorphic setups in detail.

AJAX crawling = no go

I won’t go into details on this, but just know that Google’s previous AJAX crawling solution for JavaScript has been deprecated and will eventually stop working. We shouldn’t be recommending this method.

(However, I am interested to hear any case studies from anyone who has implemented this solution recently. How has Google responded? Also, here’s a great write-up on this from my colleague Rob.)

Summary

At the risk of severely oversimplifying, here's what you need to do in order to start working with JavaScript and SEO in 2018:

  1. Know when/where your client’s domain uses client-side JavaScript to load in on-page content or links.
    1. Ask the developers.
    2. Turn off JavaScript and do some manual testing by page template.
    3. Crawl using a JavaScript crawler.
  2. Check to see if Googlebot is seeing content the way we intend it to.
    1. Use Google’s mobile-friendly testing tool.
    2. Do a site:search for visible content on the page.
    3. Crawl using a JavaScript crawler.
  3. Give an ideal recommendation to the client.
    1. Server-side rendering.
    2. Hybrid solutions (isomorphic).
    3. Not AJAX crawling.

Further resources

I’m really interested to hear about any of your experiences with JavaScript and SEO. What are some examples of things that have worked well for you? What about things that haven’t worked so well? If you’ve implemented an isomorphic setup, I’m curious to hear how that’s impacted how Googlebot sees your site.




June 26, 2018

How to Diagnose Your SEO Client’s Search Maturity

Posted by HeatherPhysioc

One of the biggest mistakes I see (and am guilty of making) is assuming a client is knowledgeable, bought-in, and motivated to execute search work simply because they agreed to pay us to do it. We start trucking full-speed ahead, dumping recommendations in their laps, and are surprised when the work doesn’t get implemented.

We put the cart before the horse. It’s easy to forget that clients start at different points of maturity and knowledge levels about search, and even clients with advanced knowledge may have organizational challenges that create barriers to implementing the work. Identifying where your client falls on a maturity curve can help you better tailor communication and recommendations to meet them where they are, and increase the likelihood that your work will be implemented.

How mature is your client?

No, not emotional maturity. Search practice maturity. This article will present a search maturity model, and provide guidance on how to diagnose where your client falls on that maturity spectrum.

This is where maturity models can help. Originally developed for the Department of Defense, and later popularized by Six Sigma methodologies, maturity models are designed to measure the ability of an organization to continuously improve in a practice. They help you diagnose the current maturity of the business in a certain area, and help identify where to focus efforts to evolve to the next stage on the maturity curve. It’s a powerful tool for meeting the client where they are, and understanding how to move forward together with them.

There are a number of different maturity models you can research online that use different language, but most maturity models follow a pattern something like this:

  • Stage 1 - Ad Hoc & Developing
  • Stage 2 - Reactive & Repeatable
  • Stage 3 - Strategic & Defined
  • Stage 4 - Managed & Measured
  • Stage 5 - Efficient & Optimizing

For search, we can think about a maturity model two ways.

One is the actual technical implementation of search best practices — is the client implementing exceptional, advanced SEO, just the basics, nothing at all, or even operating counterproductively? This can help you figure out what kinds of projects make the most sense to activate.

The second way is the organizational maturity around search engine optimization as a marketing program. Is the client aligned to the importance of organic search, allocating budget and personnel appropriately, and systematically integrating search into marketing efforts? This can help you identify the most important institutional challenges to solve for that can otherwise block the implementation of your work.

Technical SEO capabilities maturity

First, let’s dive into a maturity model for search knowledge and capabilities.

SEO capabilities criteria

We measure an organization on several important criteria that contribute to the success of SEO:

  • Collaboration - how well relevant stakeholders integrate and collaborate to do the best work possible, including inside the organization, and between the organization and the service providers.
  • Mobility - how mobile-friendly and optimized the brand is.
  • Technical - how consistently foundational technical best practices are implemented and maintained.
  • Content - how integrated organic search is into the digital content marketing practice and process.
  • On-page - how limited or extensive on-page optimization is for the brand’s content.
  • Off-page - the breadth and depth of the brand’s off-site optimization, including link-building, local listings, social profiles, and other non-site assets.
  • New technology - the appetite for and adoption of new technology that impacts search, such as voice search, AMP, or even structured data.
  • Analytics - how data-centric the organization is, ranging from not managed and measured at all, to rearview mirror performance reporting, to entirely data-driven in search decision-making.

Search Capabilities Score Card


SEO capabilities maturity stages

We assign each of the aforementioned criteria to one of these stages:

  • Stage 0 (Counterproductive) - The client is engaging in harmful or damaging SEO practices.
  • Stage 1 (Nonexistent) - There is no discernible SEO strategy or tactical implementation, and search is an all-new program for the client.
  • Stage 2 (Tactical) - The client may be doing some basic SEO best practices, but it tends to be ad hoc inclusion with little structure or pre-planning. The skills and the work meet minimum industry standards, but work is fairly basic and perhaps not cohesive.
  • Stage 3 (Strategic) - The client is aligned to the value of SEO, and makes an effort to dedicate resources to implementing best practices and staying current, as well as bake it into key initiatives. Search implementation is more cohesive and strategic.
  • Stage 4 (Practice) - Inclusion of SEO is an expectation for most of the client’s marketing initiatives, if not mandatory. They are not only implementing basic best practices but actively testing and iterating new techniques to improve their search presence. They use performance of past initiatives to drive next steps.
  • Stage 5 (Culture) - At this stage, clients are operating as if SEO is part of their marketing DNA. They have resources and processes in place, and they are knowledgeable and committed to learning more, their processes are continually reviewed and optimized, and their SEO program is evolving as the industry evolves. They are seeking cutting-edge new SEO opportunities to test.

Search Capabilities Maturity Model


While this maturity model has been peer-reviewed by a number of respected SEO peers in the industry (special thanks to Kim Jones at Seer Interactive, Stephanie Briggs at Briggsby, John Doherty at Credo, Dan Shure at Evolving SEO, and Blake Denman at Rickety Roo for your time and expertise), it is a fluid, living document designed to evolve as our industry does. If necessary, adapt it to your own reality as well.

You can download a Google Sheets copy of this maturity model here to begin using it with your client.


Why Stage 0?

In this search capabilities maturity model, I added an unconventional “Stage 0 - Counterproductive,” because organic search is unique in that a client could do real damage and be at a deficit, not just at a baseline of zero.

In a scenario like this, the client has no collaboration inside the company or with the partner agency to do smart search work. Content may be thin, weak, duplicative, spun, or over-optimized. Perhaps their mobile experience is nonexistent or very poor. Maybe they’re even engaging in black hat SEO practices, and they have link-related or other penalties.

Choosing projects based on a client’s capabilities maturity

For a client that is starting on the lower end of the maturity scale, you may not recommend starting with advanced work like AMP and visual search technology, or even detailed Schema markup or extensive targeted link-building campaigns. You may have to start with the basics like securing the site, cleaning up information architecture, and fixing title tags and meta descriptions.

For a client that is starting on the higher end of the maturity scale, you wouldn’t want to waste their time recommending the basics — they’ve probably already done them. You're better off finding new and innovative opportunities to do great search work they haven’t already mastered.

But we’re just getting started...

But technical capabilities and knowledge only begin to scratch the surface with clients. This starts to solve for what you should implement, but doesn’t touch why it’s so hard to get your work implemented. The real problems tend to be a lot squishier, and aren’t as simple as checking some SEO best-practice boxes.

How mature is your client’s search practice?

The real challenges to implementation tend to be organizational, people, integration, and process problems. Conducting a search maturity assessment with your client can be eye-opening as to what needs to be solved internally before great search work can be implemented and start reaping the rewards. Pair this with the technical capabilities maturity model above, and you have a powerhouse of knowledge and tools to help your client.

Before we dig in, I want to note one important caveat: While this maturity model focuses heavily on organizational adoption and process, I don’t want to suggest that process and procedure are substitutes for using your actual brain. You still have to think critically and make hard choices when you execute a best-in-class search program, and often that requires solving all-new problems that didn’t exist before and therefore don’t have a formal process.

Search practice maturity criteria

We measure an organization on several important criteria that contribute to the success of SEO:

  • Process, policy, or procedure - Do documented, repeatable processes for inclusion of organic search exist, and are they continually improving? Is it an organizational policy to include organic search in marketing efforts? This can mean that the process of including organic search in marketing initiatives is defined as a clear series of actions or steps taken, including both developing organic search strategy and implementing SEO tactics.
  • Personnel resources & integration - Does the necessary talent exist at the organization or within the service provider’s scope? Personnel resources may include SEO professionals, as well as support staff such as developers, data analysts, and copywriters necessary to implement organic search successfully. Active resources may work independently in a disjointed manner or collaboratively in an integrated manner.
  • Knowledge & learning - Because search is a constantly evolving field, is the organization knowledgeable about search and committed to continuously learning? Information can include existing knowledge, past experience, or training in organic search strategy and tactics. It can also include a commitment to learning more, possibly through willingness to undertake trainings, attendance of conferences, regular consumption of learning materials, or staying current in industry news and trends.
  • Means, capacity, & capabilities - Does the organization budget appropriately for and prioritize the organic search program? Means, capacity and capabilities can include being scoped into a client contract, adequate budget being allocated to the work, adequate human resources being allocated to the work, the capacity to complete the work when measured against competing demands, and the prioritization of search work alongside competing demands.
  • Planning & preparation - Is organic search aligned to business goals, brand goals, and/or campaign goals? Is organic search proactively planned, reactive, or not included at all? This measure evaluates how frequently organic search efforts are included in marketing efforts for a brand. It also measures how frequently the work is included proactively and pre-planned, as opposed to reactively as an afterthought. Work may be aligned to or disconnected from the "big picture."

Organizational search maturity


Search practice stages of maturity

Stage 1 - Initial & ad hoc

At this stage, the organization’s search application may be nonexistent, unstable, or uncontrolled. There may be rare and small SEO efforts, but they are entirely ad hoc and inconsistent, and retrofitted to the work after the fact, at best. They tend to lack any discernible goal orientation. If SEO exists, it is disconnected from larger goals and not integrated with any other practices across the organization. They may be just beginning their search practice for the first time.

Stage 2 - Repeatable but reactive

These organizations are at least doing some search basics, though there is no rigorous use or enforcement of it. It is very reactive and in-the-moment while projects are being implemented; it is rarely pre-planned, and often SEO is applied as an afterthought. They are executing only in the present or when it’s too late to do the highest-caliber search work, but they are making an effort. SEO efforts may occasionally be going after goals, but they are unlikely to be tied to larger business goals. (Most of my client relationships have started here.)

Stage 3 - Defined & understood

These organizations have started to document their processes and are satisfactorily knowledgeable and competent in search. They have minimum standards for search best practices and process is emerging. Many people inside and outside the organization understand that search is important and are taking steps to integrate. There is a clear search strategy that aligns to organizational goals and processes. Proactive search preparation and planning happens prior to activating projects.

Stage 4 - Managed & capable

These organizations have proactive, predictable implementation of search work. They have quality-focused rules for products and processes, and can quickly detect and correct missteps. They have clearly defined processes for integration, implementation and oversight, but are flexible enough to adapt to a range of conditions without sacrificing quality. These organizations consider search part of their “way of life.”

Stage 5 - Efficient & optimizing

Organizations at this stage have a strong mastery of search and implement it efficiently as a matter of policy. They have cross-organizational integration and proactively work to strengthen their search performance. They are always improving the process through incremental or innovative change. They review and analyze their process and implementation to keep optimizing. These organizations could potentially be considered market-leading or innovative.

Scorecard exercise


You are here

Before you can know how to get where you want to go, you need to know where you are. It's important to understand where the organization stands, and then where they need to be in the future. Going through the quantitative exercise of diagnosing their maturity can help everyone align to where to start.

You can use these scorecards to assess factors like leadership alignment to the value of search, employee availability and involvement, knowledge and training, process and standardization, their culture (or lack thereof) of data-driven problem-solving and continuous improvement, and even budget.

A collaborative exercise

This should be a deeper exercise than just punching numbers into a spreadsheet, and it certainly shouldn’t be a one-sided assessment from you as an outsider. It is much more valuable to ask several relevant people at multiple levels across the client organization to participate in this exercise, and can become much richer if you take the time to talk to people at various points in the process.

How to use the scorecard & diagnose maturity

Once you download the scorecards, follow these steps to begin the maturity assessment process.

  1. Client-side distribution - Distribute surveys to relevant stakeholders on the client's internal team. Ideally, these individuals serve at a variety of levels at the company and occupy a mix of roles relevant to the organic search practice. These could include CEO, CMO, Marketing VPs and directors, digital marketing coordinators, and in-house SEOs.
  2. Agency-side distribution - Distribute surveys to relevant stakeholders on the agency team. Ideally, these individuals serve at a variety of levels at the agency and occupy a mix of roles relevant to the organic search practice. These could include digital marketing coordinators, client engagement specialists, analysts, digital copywriters, or SEO practitioners.
  3. Assign a level of maturity to each criteria - Each survey participant can simply mark one "X" per category row in the column that most accurately reflects perception of the brand organization as it pertains to organic search. (For example, if the survey respondent feels that SEO process and procedure are non-existent based on the description, they can mark an “X” in the “Initial/Ad Hoc” column. Alternatively, if they feel they are extraordinarily advanced and efficient in their processes, they may mark the “X” in the “Efficient & Optimizing” column.)
  4. Collect the surveys - Assign a point value of 1, 2, 3, 4, or 5 to the responses from left to right in the scorecard. Average the points to get a final score for each. (For example, if five client stakeholders score their SEO process and procedure as 3, 4, 2, 3, 3 respectively, the average score is 3 for that criteria.)
  5. Comparing client to agency perception - You may also choose to ask survey respondents to denote whether they are client-side or agency-side so you can look at the data both in aggregate, and by client and agency separately, to determine if there is alignment or disagreement on where the brand falls on the maturity curve. This can be great material for discussion with the client that can open up conversations about why those differences in perception exist. (A small scoring sketch follows this list.)
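
To illustrate steps 4 and 5, here is a small scoring sketch with made-up criteria names and responses; a spreadsheet would do the job just as well.

type Side = "client" | "agency";

interface SurveyResponse {
  side: Side;
  scores: Record<string, number>; // criterion -> rating from 1 to 5
}

// Hypothetical responses; in practice these come from the collected surveys.
const responses: SurveyResponse[] = [
  { side: "client", scores: { "Process & procedure": 3, "Knowledge & learning": 2 } },
  { side: "client", scores: { "Process & procedure": 4, "Knowledge & learning": 3 } },
  { side: "agency", scores: { "Process & procedure": 2, "Knowledge & learning": 2 } },
];

const average = (nums: number[]): number =>
  nums.reduce((sum, n) => sum + n, 0) / nums.length;

// Average each criterion overall and per side to surface perception gaps.
for (const criterion of Object.keys(responses[0].scores)) {
  const scoreFor = (side?: Side) =>
    average(
      responses
        .filter((r) => side === undefined || r.side === side)
        .map((r) => r.scores[criterion])
    );
  console.log(
    criterion + ": overall " + scoreFor().toFixed(1) +
      ", client " + scoreFor("client").toFixed(1) +
      ", agency " + scoreFor("agency").toFixed(1)
  );
}

A large gap between the client and agency averages for a criterion is exactly the kind of perception difference worth discussing together.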

Screenshot of scorecard

To get your own scorecard, make a copy of the Google Sheet.

Choosing where to start

The goal is to identify together where to start working. This means finding the strengths to capitalize upon, areas of acceptability that can be nudged to a strength with a little work, weaknesses to improve upon, agreeing on areas to focus, and finally, how to get started tackling the first change together.

For a client that is starting on the low end of the maturity scale, it is unrealistic to expect that they have connected all the dots between important stakeholders, that they have a clearly defined and repeatable process, and that their search program is a well-oiled machine. If you don’t work together to solve the underlying problems like knowledge or adequate personnel resources first, you will struggle to get buy-in for the work or the resources to get it done, so it doesn’t matter what projects you recommend.

For a client that is advanced in a few areas, say process, planning, and capacity, but weaker in others like knowledge and learning, that might suggest that you need to focus efforts on an education campaign to help the client prioritize the work and fit it into a busy queue.

For a client that is already advanced across the board, your role instead may be to keep the machine running while also helping them spot minor areas of improvement so they can keep iterating and perfecting the process. This client might also be ready for more advanced search strategies and tactical recommendations, or perhaps more robust integrations across additional disciplines.

One foot in front of the other

It’s rare that we live in a world of radical change where we overhaul everything en masse and see epic change overnight. We tweak, test, learn, and iterate. A maturity model is a continuum, and brands must evolve from one step to the next. Skipping levels is not an option. Some may also call this a “crawl, walk, run” approach.

Your goal as their trusted search advisor is not to help them leap from Stage 2 to Stage 5. Accomplishing that trajectory and speed of growth is exceedingly difficult and rare. Instead, focus your efforts on how the client can get to the next stage over the next 12 months. As they progress up the maturity model, the length of time it takes to unlock the next level may grow longer and longer.

Organizational Search Maturity


Even when an organization reaches Stage 5, the work is not done. Master-level organizations continue to refine and optimize their processes and capabilities.

There is no finish line to search maturity

There is a French culinary phrase, “mise en place,” that refers to having everything — ingredients, tools, recipe — in its place to begin cooking most successfully. There are several key ingredients to any successful project implementation: buy-in, process, knowledge and skills, capacity, planning, and more.

As your client evolves up the maturity curve, you will see and feel a transition from thinking about these aspects only once a project is sliding off the rails, to including them in real time and reactively, to anticipating them before every project and doing your due diligence to come prepared. Essentially, the client can move from not being able to spell "SEO" to making SEO a part of their DNA by moving up these maturity curves.

It is important to revisit the maturity model discussion periodically — I recommend doing so at least annually — to level-set and realign with the client. Conducting this exercise again can remind us to pause and reflect on all we have accomplished since the first scoring. It can also re-energize stakeholders to make even more progress in the upcoming year.

