December 13, 2018

3 Big Lessons from Interviewing John Mueller at SearchLove London – Whiteboard Friday

Posted by willcritchlow

When you've got one of Google's most helpful and empathetic voices willing to answer your most pressing SEO questions, what do you ask? Will Critchlow recently had the honor of interviewing Google's John Mueller at SearchLove London, and in this week's edition of Whiteboard Friday he shares his best lessons from that session, covering the concept of Domain Authority, the great subdomain versus subfolder debate, and a view into the technical workings of noindex/nofollow.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hi, Whiteboard Friday fans. I'm Will Critchlow from Distilled. I found myself in Seattle and wanted to record another Whiteboard Friday video to talk through some things I learned when I recently got to sit down with John Mueller from Google at our SearchLove London conference.

So I got to interview John on stage, and, as many of you may know, John is a webmaster relations guy at Google and really a point of contact for many of us in the industry when there are technical questions or questions about how Google is treating different things. If you followed some of the stuff that I've written and talked about in the past, you'll know that I've always been a little bit suspicious of some of the official lines that come out of Google and felt like either we don't get the full story or we haven't been able to drill in deep enough and really figure out what's going on.

I was under no illusions that I might be able to completely fix this in one go, but I did want to grill John on a couple of specific things where I felt like we hadn't maybe asked things clearly enough or got the full story. Today I wanted to run through a few things that I learned when John and I sat down together. A little side note: I found it really fascinating doing this kind of interview. I sat on stage in a kind of journalistic setting, which I had never done before. Maybe I'll do a follow-up Whiteboard Friday one day on things I learned and how to run interviews.

1. Does Google have a "Domain Authority" concept?

But the first thing that I wanted to quiz John about was this domain authority idea. So here we are on Moz. Moz has a proprietary metric called Domain Authority, or DA. I feel like when, as an industry, we've asked Google, and John in particular, whether Google has a concept of domain authority, the question has gotten bundled up with that proprietary metric, giving him an easy way out of answering: "No, no, that's a proprietary Moz metric. We don't have that."

I felt like that had got a bit confusing, because our suspicion is that there is some kind of an authority or a trust metric that Google has and holds at a domain level. We think that's true, but we felt like they had always been able to wriggle out of answering the question. So I said to John, "Okay, I am not asking you do you use Moz's domain authority metric in your ranking factors. Like we know that isn't the case. But do you have something a little bit like it?"

Yes, Google has metrics that map into similar things

John said yes. His exact quote was that they have metrics that "map into similar things." My way of phrasing it was: this is stuff at the domain level, it's based on things like link authority, and it's used to understand performance or to rank content across an entire domain. John said yes, they have something similar to that.

New content inherits those metrics

They use it in particular when they discover new content on an existing domain. New content, in some sense, can inherit some of the authority from the domain. This is part of the reason we figured they must have something like this: we've seen identical content perform differently on different sites, so we know there's something to it. John confirmed that until a piece of content has been around long enough to accumulate its own link metrics and usage metrics, it can inherit some of this authority from the domain.

Not wholly link-based

He did also just confirm that it's not just link-based. This is not just a domain-level PageRank type thing.

2. Subdomains versus subfolders

This led me into the second thing that I really wanted to get out of him, which was — and when I raised this, I got kind of an eye roll, "Are we really going down this rabbit hole" — the subdomain versus subfolder question. You might have seen me talk about this. You might have seen people like Rand talk about this, where we've seen cases and we have case studies of moving blog.example.com to example.com/blog and changing nothing else and getting an uplift.

We know something must be going on, and yet the official line out of Google has for a very long time been: "We don't treat these things differently. There is nothing special about subfolders. We're perfectly happy with subdomains. Do whatever is right for your business." We've had this kind of back-and-forth a few times. The way I put it to John was I said, "We have seen these case studies. How would you explain this?"

They try to figure out what belongs to the site

To his credit, John said, "Yes, we've seen them as well." So he said, yes, Google has also seen these things. He acknowledged this is true. He acknowledged that it happens. The way he explained it connects back into this Domain Authority thing in my mind, which is to say that the way they think about it is: Are these pages on this subdomain part of the same website as things on the main domain?

That's kind of the main question. They try and figure out, as he put it, "what belongs to this site." We all know of sites where subdomains are entirely different sites. If you think about a blogspot.com or a WordPress.com domain, subdomains might be owned and managed by entirely different people, and there would be no reason for that authority to pass across. But what Google is trying to do is ask, "Is this subdomain part of this main site?"

Sometimes this includes subdomains and sometimes not

He said sometimes they determine that it is, and sometimes they determine that it is not. If it is part of the site, in their estimation, then they will treat it as equivalent to a subfolder. This, for me, pretty much closes this loop. I think we understand each other now, which is Google is saying, in these certain circumstances, they will be treated identically, but there are circumstances where it can be treated differently.

My recommendation stays what it's always been, which is 100% if you're starting from the outset, put it on a subfolder. There's no upside to the subdomain. Why would you risk the fact that Google might treat it as a separate site? If it is currently on a subdomain, then it's a little trickier to make that case. I would personally be arguing for the integration and for making that move.
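If you do make that move, the mechanical part is a one-to-one 301 redirect map from every subdomain URL to its subfolder equivalent. Here's a minimal sketch in Python, assuming hypothetical hostnames (`blog.example.com` moving to `example.com/blog`); a real migration would generate this mapping from the full crawl of the old site:

```python
from urllib.parse import urlsplit, urlunsplit

def subdomain_to_subfolder(url, subdomain="blog", root="example.com"):
    """Rewrite a subdomain URL to its subfolder equivalent, e.g.
    https://blog.example.com/post -> https://example.com/blog/post."""
    parts = urlsplit(url)
    if parts.hostname != f"{subdomain}.{root}":
        return url  # not on the subdomain; leave it untouched
    return urlunsplit((parts.scheme, root,
                       f"/{subdomain}{parts.path}",
                       parts.query, parts.fragment))

print(subdomain_to_subfolder("https://blog.example.com/seo-tips?ref=rss"))
# -> https://example.com/blog/seo-tips?ref=rss
```

Each old URL would then get a server-side 301 pointing at its rewritten counterpart, so that link equity consolidates on the main domain.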

If it's treated as part of the site, a subdomain is equivalent to a subfolder

Unfortunately, but somewhat predictably, I couldn't tie John down to any particular way of telling if this is the case. If your content is currently on a subdomain, there isn't really any way of telling whether Google is treating it differently, which is a shame. But at least we understand each other now, and I think we've kind of got to the root of the confusion. These case studies are real. This is a real thing. Certainly in certain circumstances, moving from the subdomain to the subfolder can improve performance.

3. Noindex's impact on nofollow

The third thing that I want to talk about is a little bit more geeked out and technical, and, in some sense, it leads to some bigger-picture lessons and thinking. A little while ago John kind of caught us out by talking about how, if you have a page that you noindex and keep it that way for a long time, Google will eventually treat it as equivalent to a noindex, nofollow.

In the long-run, a noindex page's links effectively become nofollow

In other words, the links off that page, even if you've marked it noindex, follow, will effectively be nofollowed. We found that a little bit confusing and surprising. I had certainly assumed it didn't work that way, simply because the noindex, follow directive exists, and the fact that it's a thing seems to suggest that the links ought to keep being followed.
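For reference, the directive in question lives in an HTML meta robots tag. This small Python sketch, using only the standard library, shows how the tag is typically expressed and how its directives can be read back out; John's point is that, long-term, Google nofollows the page's links regardless of the "follow" token:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the directives from any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives |= {d.strip().lower()
                                for d in a.get("content", "").split(",")}

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
parser = RobotsMetaParser()
parser.feed(page)
print(parser.directives)  # e.g. {'noindex', 'follow'}
```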

It's been this way for a long time

It wasn't really so much the specifics of this as the bigger questions: How did we not know this? How did this come about? John talked about how, firstly, it has been this way for a long time. I think he was making the point that none of us noticed, so how big a deal can this really be? I put it back to him that this is a subtle thing and very hard to test, very hard to extract from the different confounding factors that might be going on.

I'm not surprised that, as an industry, we missed it. But the point being it's been this way for a long time, and Google's view and certainly John's view was that this hadn't been hidden from us so much as the people who knew this hadn't realized that they needed to tell anyone. The actual engineers working on the search algorithm, they had a curse of knowledge.

The curse of knowledge: engineers didn't realize webmasters had the wrong idea

They knew it worked this way, and they had never realized that webmasters didn't know that or thought any differently. This was one of the things that I was kind of trying to push to John a little more was kind of saying, "More of this, please. Give us more access to the engineers. Give us more insight into their way of thinking. Get them to answer more questions, because then out of that we'll spot the stuff that we can be like, 'Oh, hey, that thing there, that was something I didn't know.' Then we can drill deeper into that."

That led us into a little bit of a conversation about how John operates when he doesn't know the answer, and so there were some bits and pieces that were new to me at least about how this works. John said he himself is generally not attending search quality meetings. The way he works is largely off his knowledge and knowledge base type of content, but he has access to engineers.

They're not dedicated to the webmaster relations operation. He's just going around the organization, finding individual Google engineers to answer these questions. It was somewhat interesting to me at least to find that out. I think hopefully, over time, we can generally push and say, "Let's look for those engineers. John, bring them to the front whenever they want to be visible, because they're able to answer these kinds of questions that might just be that curse of knowledge that they knew this all along and we as marketers hadn't figured out this was how things worked."

That was my quick run-through of some of the things that I learned when I interviewed John. We'll link over to more resources and transcripts and so forth. But it's been a blast. Take care.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

December 13, 2018

Niche Ranking Factors FAQ

This year, we’ve been talking a lot about niche ranking factors, with Searchmetrics Founder, Marcus Tober, holding talks at numerous SEO conferences, and our ...

The post Niche Ranking Factors FAQ appeared first on Searchmetrics SEO Blog.

December 11, 2018

Why Local Businesses Will Need Websites More than Ever in 2019

Posted by MiriamEllis

64% of 1,411 surveyed local business marketers agree that Google is becoming the new “homepage” for local businesses. Via Moz State of Local SEO Industry Report

...but please don’t come away with the wrong storyline from this statistic.

As local brands and their marketers watch Google play Trojan horse, shifting from top benefactor to top competitor by replacing former “free” publicity with paid packs, Local Service Ads, zero-click SERPs, and related structures, it’s no surprise to see forum members asking, “Do I even need a website anymore?”

Our answer to this question is, "Yes, you've never needed a website more than you will in 2019." In this post, we'll examine:

  • Why it looks like local businesses don’t need websites
  • Statistical proofs of why local businesses need websites now more than ever
  • The current status of local business websites and most-needed improvements

How Google stopped bearing so many gifts

Within recent memory, a Google query with local intent brought up a big pack of ten nearby businesses, with each entry taking the user directly to these brands’ websites for all of their next steps. A modest amount of marketing effort was rewarded with a shower of Google gifts in the form of rankings, traffic, and conversions.

Then these generous SERPs shrank to seven spots, and then three, with the mobile sea change thrown into the bargain: layers and layers of Google-owned interfaces instead of direct-to-website links. In 2018, when we rustle through the wrapping paper, the presents we find from Google look cheaper, smaller, and less magnificent.

Consider these five key developments:

1) Zero-click mobile SERPs

This slide from a recent presentation by Rand Fishkin encapsulates his findings regarding the growth of no-click SERPs between 2016–2018. Mobile users have experienced a 20% increase in delivery of search engine results that don’t require them to go any deeper than Google’s own interface.

2) The encroachment of paid ads into local packs

When Dr. Peter J. Myers surveyed 11,000 SERPs in 2018, he found that 35% of competitive local packs feature ads.

3) Google becoming a lead gen agency

At last count, Google’s Local Service Ads program, via which they interpose themselves as the paid lead gen agent between businesses and consumers, has taken over 23 business categories in 77 US cities.

4) Even your branded SERPs don’t belong to you

When a user specifically searches for your brand and your Google Knowledge Panel pops up, you can likely cope with the long-standing “People Also Search For” set of competitors at the bottom of it. But that’s not the same as Google allowing Groupon to advertise at the top of your KP, or putting lead gen from Doordash and GrubHub front and center to nickel and dime you on your own customers’ orders.

5) Google is being called the new “homepage” for local businesses

As highlighted at the beginning of this post, 64% of marketers agree that Google is becoming the new “homepage” for local businesses. This concept, coined by Mike Blumenthal, signifies that a user looking at a Google Knowledge Panel can get basic business info, make a phone call, get directions, book something, ask a question, take a virtual tour, read microblog posts, see hours of operation, thumb through photos, see busy times, read and leave reviews. Without ever having to click through to a brand’s domain, the user may be fully satisfied.

“Nothing is enough for the man to whom enough is too little.”
- Epicurus

There are many more examples we could gather, but they can all be summed up in one way: None of Google’s most recent local initiatives are about driving customers to brands’ own websites. Local SERPs have shrunk and have been re-engineered to keep users within Google’s platforms to generate maximum revenue for Google and their partners.

You may be as philosophical as Epicurus about this and say that Google has every right to be as profitable as they can with their own product, even if they don’t really need to siphon more revenue off local businesses. But if Google’s recent trajectory causes your brand or agency to conclude that websites have become obsolete in this heavily controlled environment, please keep reading.

Your website is your bedrock

“65% of 1,411 surveyed marketers observe strong correlation between organic and local rank.” - Via Moz State of Local SEO Industry Report

What this means is that businesses which rank highly organically are very likely to have high associated local pack rankings. In the following screenshot, if you take away the directory-type platforms, you will see how the brand websites ranking on page 1 for “deli athens ga” are also the two businesses that have made it into Google’s local pack:

How often do the top 3 Google local pack results also have 1st-page organic rankings?

In a small study, we looked at 15 head keywords across 7 US cities and towns. This yielded 315 possible entries in Google’s local pack. Of that 315, 235 of the businesses ranking in the local packs also had page 1 organic rankings. That’s a 75% correlation between organic website rankings and local pack presence.
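The arithmetic behind that figure, using the numbers quoted above:

```python
# Figures from the study above: 15 keywords x 7 cities x 3 local pack positions.
possible_entries = 15 * 7 * 3        # 315 local pack entries examined
with_page1_organic = 235             # pack entries whose site also ranked on page 1

correlation = with_page1_organic / possible_entries
print(f"{correlation:.0%}")          # 75%
```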

*It’s worth noting that where local and organic results did not correlate, it was sometimes due to the presence of spam GMB listings, or to mystery SERPs that did not make sense at first glance — perhaps as a result of Google testing, in some cases.

Additionally, many local businesses are not making it to the first page of Google anymore in some categories because the organic SERPs are inundated with best-of lists and directories. Often, local business websites were pushed down to the second page of the organic results. In other words, if spam, “best-ofs,” and mysteries were removed, the local-organic correlation would likely be much higher than 75%.

Further, one recent study found that even when Google’s Local Service Ads are present, 43.9% of clicks went to the organic SERPs. Obviously, if you can make it to the top of the organic SERPs, this puts you in very good CTR shape from a purely organic standpoint.

Your takeaway from this

The local businesses you market may not be able to stave off the onslaught of Google’s zero-click SERPs, paid SERPs, and lead gen features, but where “free” local 3-packs still exist, your very best bet for being included in them is to have the strongest possible website. Moreover, organic SERPs remain a substantial source of clicks.

Far from it being the case that websites have become obsolete, they are the firmest bedrock for maintaining free local SERP visibility amidst an increasing scarcity of opportunities.

This calls for an industry-wide doubling down on organic metrics that matter most.

Bridging the local-organic gap

“We are what we repeatedly do. Excellence, then, is not an act, but a habit.”
- Aristotle

A 2017 CNBC survey found that 45% of small businesses have no website, and, while most large enterprises have websites, many local businesses qualify as “small.”

Moreover, a recent audit of 9,392 Google My Business listings found that 27% have no website link.

When asked which one task 1,411 marketers want clients to devote more resources to, it’s no coincidence that 66% listed a website-oriented asset. This includes local content development, on-site optimization, local link building, technical analysis of rankings/traffic/conversions, and website design as shown in the following Moz survey graphic:

In an environment in which websites are table stakes for competitive local pack rankings, virtually all local businesses not only need one, but they need it to be as strong as possible so that it achieves maximum organic rankings.

What makes a website strong?

The Moz Beginner’s Guide to SEO offers incredibly detailed guidelines for creating the best possible website. While we recommend that everyone marketing a local business read through this in-depth guide, we can sum up its contents here by stating that strong websites combine:

  • Technical basics
  • Excellent usability
  • On-site optimization
  • Relevant content publication
  • Publicity

For our present purpose, let’s take a special look at those last three elements.

On-site optimization and relevant content publication

There was a time when on-site SEO and content development were treated almost independently of one another. And while local businesses will need to make a little extra effort to put their basic contact information in prominent places on their websites (such as the footer and Contact Us page), publication and optimization should be viewed as a single topic. A modern strategy takes all of the following into account:

  • Keyword and real-world research tell a local business what consumers want
  • These consumer desires are then reflected in what the business publishes on its website, including its homepage, location landing pages, about page, blog and other components
  • Full reflection of consumer desires includes ensuring that human language (discovered via keyword and real-world research) is implemented in all elements of each page, including its tags, headings, descriptions, text, and in some cases, markup

What we’re describing here isn’t a set of disconnected efforts. It’s a single effort that’s integral to researching, writing, and publishing the website. Far from stuffing keywords into a tag or a page’s content, focus has shifted to building topical authority in the eyes of search engines like Google by building an authoritative resource for a particular consumer demographic. The more closely a business is able to reflect customers’ needs (including the language of their needs), in every possible component of its website, the more relevant it becomes.

A hypothetical example of this would be a large medical clinic in Dallas. Last year, their phone staff was inundated with basic questions about flu shots, like where and when to get them, what they cost, would they cause side effects, what about side effects on people with pre-existing health conditions, etc. This year, the medical center’s marketing team took a look at Moz Keyword Explorer and saw that there’s an enormous volume of questions surrounding flu shots:

This tiny segment of the findings of the free keyword research tool, Answer the Public, further illustrates how many questions people have about flu shots:

The medical clinic need not compete nationally for these topics, but at a local level, a page on the website can answer nearly every question a nearby patient could have about this subject. The page, created properly, will reflect human language in its tags, headings, descriptions, text, and markup. It will tell all patients where to come and when to come for this procedure. It has the potential to cut down on time-consuming phone calls.

And, finally, it will build topical authority in the eyes of Google to strengthen the clinic’s chances of ranking well organically… which can then translate to improved local rankings.
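The "markup" mentioned here usually means schema.org structured data. As an illustration only (the clinic and every field value here are invented), a local business page might embed JSON-LD along these lines, built here with Python's json module:

```python
import json

# Illustrative structured data for the hypothetical Dallas clinic;
# all field values are invented.
clinic_markup = {
    "@context": "https://schema.org",
    "@type": "MedicalClinic",
    "name": "Example Dallas Clinic",
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Dallas",
        "addressRegion": "TX",
    },
    "openingHours": "Mo-Fr 08:00-18:00",
}

# This JSON would sit in the page inside a
# <script type="application/ld+json"> tag.
print(json.dumps(clinic_markup, indent=2))
```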

It’s important to note that keyword research tools typically do not reflect location very accurately, so research is typically done at a national level, and then adjusted to reflect regional or local language differences and geographic terms, after the fact. In other words, a keyword tool may not accurately reflect exactly how many local consumers in Dallas are asking “Where do I get a flu shot?”, but keyword and real-world research signals that this type of question is definitely being asked. The local business website can reflect this question while also adding in the necessary geographic terms.
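Adding those geographic terms is easy to mechanize once the national-level question list is in hand. A tiny sketch, with invented questions and modifiers, just to show the expansion step:

```python
# National-level question keywords plus invented geographic modifiers.
questions = [
    "where do i get a flu shot",
    "how much does a flu shot cost",
    "flu shot side effects",
]
geo_terms = ["dallas", "dallas tx", "near me"]

# Pair every question with every modifier.
localized = [f"{q} {geo}" for q in questions for geo in geo_terms]
print(len(localized))  # 9 candidate phrases to sanity-check by hand
```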

Local link building must be brought to the fore of publicity efforts

Moz’s industry survey found that more than one-third of respondents had no local link building strategy in place. Meanwhile, link building was listed as one of the top three tasks to which marketers want their clients to devote more resources. There’s clearly a disconnect going on here. Given the fundamental role links play in building Domain Authority, organic rankings, and subsequent local rankings, building strong websites means bridging this gap.

First, it might help to examine old prejudices that could cause local business marketers and their clients to feel dubious about link building. These most likely stem from link spam which has gotten so out of hand in the general world of SEO that Google has had to penalize it and filter it to the best of their ability.

Not long ago, many digital-only businesses were having a heyday with paid links, link farms, reciprocal links, abusive link anchor text and the like. An online company might accrue thousands of links from completely irrelevant sources, all in hopes of escalating rank. Clearly, these practices aren’t ones an ethical business can feel good about investing in, but they do serve as an interesting object lesson, especially when a local marketer can point out to a client that the best local links typically result from real-world relationship-building.

Local businesses are truly special because they serve a distinct, physical community made up of their own neighbors. The more involved a local business is in its own community, the more naturally link opportunities arise from things like local:

  • Sponsorships
  • Event participation and hosting
  • Online news
  • Blogs
  • Business associations
  • B2B cross-promotions

There are so many ways a local business can build genuine topical and domain authority in a given community by dint of the relationships it develops with neighbors.

An excellent way to get started on this effort is to look at high-ranking local businesses in the same or similar business categories to discover what work they’ve put in to achieve a supportive backlink profile. Moz Link Intersect is an extremely actionable resource for this, enabling a business to input its top competitors to find who is linking to them.

In the following example, a small B&B in Albuquerque looks up two luxurious Tribal resorts in its city:

Link Intersect then lists out a blueprint of opportunities, showing which links one or both competitors have earned. Drilling down, the B&B finds that Marriott.com is linking to both Tribal resorts on an Albuquerque things-to-do page:

The small B&B can then try to earn a spot on that same page, because it hosts lavish tea parties as a thing-to-do. Outreach could depend on the B&B owner knowing someone who works at the local Marriott personally. It could include meeting with them in person, or on the phone, or even via email. If this outreach succeeds, an excellent, relevant link will have been earned to boost organic rank, underpinning local rank.
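At its core, a link intersect analysis is just a set intersection over competitors' backlink source domains. A sketch with invented data (not Moz Link Intersect's real API) makes the logic concrete:

```python
# Hypothetical backlink source domains for the two competitor resorts;
# all of this data is invented for illustration.
resort_a_links = {"marriott.com", "visitalbuquerque.org", "tripsavvy.com"}
resort_b_links = {"marriott.com", "visitalbuquerque.org", "newmexico.org"}

# Domains linking to both competitors are the warmest outreach targets.
shared_sources = resort_a_links & resort_b_links
print(sorted(shared_sources))  # ['marriott.com', 'visitalbuquerque.org']
```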

Then, repeat the process. Aristotle might well have been speaking of link building when he said we are what we repeatedly do and that excellence is a habit. Good marketers can teach customers to have excellent habits in recognizing a good link opportunity when they see it.

Taken altogether

Without a website, a local business lacks the brand-controlled publishing and link-earning platform that so strongly influences organic rankings. In the absence of this, the chances of ranking well in competitive local packs will be significantly less. Taken altogether, the case is clear for local businesses investing substantially in their websites.

Acting now is actually a strategy for the future

“There is nothing permanent except change.”
- Heraclitus

You’ve now determined that strong websites are fundamental to local rankings in competitive markets. You’ve absorbed numerous reasons to encourage local businesses you market to prioritize care of their domains. But there’s one more thing you’ll need to be able to convey, and that’s a sense of urgency.

Right now, every single customer you can still earn from a free local pack listing is immensely valuable for the future.

This isn’t a customer you’ve had to pay Google for, as you very well might six months, a year, or five years from now. Yes, you’ve had to invest plenty in developing the strong website that contributed to the high local ranking, but you haven’t paid a penny directly to Google for this particular lead. Soon, you may be having to fork over commissions to Google for a large portion of your new customers, so acting now is like insurance against future spend.

For this to work out properly, local businesses must take the leads Google is sending them right now for free and convert them into long-term, loyal customers, with an ultimate value of multiple future transactions without Google as the middleman. And if these freely won customers can be inspired to act as word-of-mouth advocates for your brand, you will have done something substantial to develop a stream of non-Google-dependent revenue.

This offer may well expire as time goes by. When it comes to the capricious local SERPs, marketers resemble the Greek philosophers who knew that change is the only constant. The Trojan horse has rolled into every US city, and it’s a gift with a questionable shelf life. We can’t predict if or when free packs might become obsolete, but we share your concerns about the way the wind is blowing.

What we can see clearly right now is that websites will be anything but obsolete in 2019. Rather, they are the building blocks of local rankings, precious free leads, and loyal revenue, regardless of how SERPs may alter in future.

For more insights into where local businesses should focus in 2019, be sure to explore the Moz State of Local SEO industry report:

Read the State of Local SEO industry report



December 10, 2018

Evolving Keyword Research to Match Your Buyer’s Journey

Posted by matthew_jkay

Keyword research has been around as long as the SEO industry has. Search engines built a system that revolves around users entering a term or query into a text entry field, hitting return, and receiving a list of relevant results. As the online search market expanded, one clear leader emerged — Google — and with it came AdWords (now Google Ads), an advertising platform that allowed organizations to appear on search results pages for keywords they might not rank for organically.

Within Google Ads came a tool that enabled businesses to look at how many searches there were per month for almost any query. Google Keyword Planner became the de facto tool for keyword research in the industry, and with good reason: it was Google’s data. Not only that, Google gave us the ability to gather further insights due to other metrics Keyword Planner provided: competition and suggested bid. Whilst these keywords were Google Ads-oriented metrics, they gave the SEO industry an indication of how competitive a keyword was.

The reason is obvious. If a keyword or phrase has higher competition (i.e. more advertisers bidding to appear for that term), it’s likely to be more competitive from an organic perspective. Similarly, a term that has a higher suggested bid is more likely to be a competitive term. SEOs dined on this data for years, but when the industry started digging a bit more into the data, we soon realized that while useful, it was not always wholly accurate. Moz, SEMrush, and other tools all started to develop alternative volume and competitive metrics using clickstream data to give marketers more insights.

Now industry professionals have several software tools and data outlets to conduct their keyword research. These software companies will only improve in the accuracy of their data outputs. Google’s data is unlikely to significantly change; their goal is to sell ad space, not make life easy for SEOs. In fact, they've made life harder by using volume ranges for Google Ads accounts with low activity. SEO tools have investors and customers to appease and must continually improve their products to reduce churn and grow their customer base. This makes things rosy for content-led SEO, right?

Well, not really.

The problem with historical keyword research is twofold:

1. SEOs spend too much time thinking about the decision stage of the buyer’s journey (more on that later).

2. SEOs spend too much time thinking about keywords, rather than categories or topics.

The industry, to its credit, is doing a lot to tackle issue number two. “Topics over keywords” is not a new idea, as I’ll come to shortly, and frameworks for topic-based SEO have started to appear over the last few years. This is a step in the right direction. Organizing site content into categories, adding appropriate internal linking, and understanding that one piece of content can rank for several variations of a phrase are all becoming far more commonplace.

What is less well known (but starting to gain traction) is point one. But in order to understand this further, we should dive into what the buyer’s journey actually is.

What is the buyer’s journey?

The buyer’s (or customer’s) journey is not new. Open a marketing textbook from years gone by, take a college course in marketing, or even just browse general marketing blogs and you’ll see it crop up. There are lots of variations of the journey, but they all say a similar thing: no matter what product or service is being bought, every buyer goes through it, online or offline. The main difference is that, depending on the product, person, or situation, the amount of time the journey takes will vary. But what is it, exactly? For the purpose of this article, we’ll focus on three stages: awareness, consideration, and decision.

Awareness

The awareness stage of the buyer’s journey is similar to problem discovery, where a potential customer realizes that they have a problem (or an opportunity) but they may not have figured out exactly what that is yet.

Search terms at this stage are often question-based — users are researching around a particular area.

Consideration

The consideration stage is where a potential consumer has defined what their problem or opportunity is and has begun to look for potential solutions to help solve the issue they face.

Decision

The decision stage is where most organizations focus their attention. Normally consumers are ready to buy at this stage and are often doing product or vendor comparisons, looking at reviews, and searching for pricing information.

To illustrate this process, let’s take two examples: buying an ice cream and buying a holiday.

Being low-value, the former is not a particularly considered purchase, but this journey still takes place. The latter is more considered. It can often take several weeks or months for a consumer to decide on what destination they want to visit, let alone a hotel or excursions. But how does this affect keyword research, and the content which we as marketers should provide?

At each stage, a buyer will have a different thought process. It’s key to note that not every buyer of the same product will have the same thought process, but you can see how we can start to formulate one.

The Buyer’s Journey - Holiday Purchase

The above table illustrates the sort of queries or terms that consumers might use at different stages of their journey. The problem is that most organizations focus all of their efforts on the decision end of the spectrum. This is entirely the right approach to take at the start because you’re targeting consumers who are interested in your product or service then and there. However, in an increasingly competitive online space you should try and find ways to diversify and bring people into your marketing funnel (which in most cases is your website) at different stages.

I agree with the argument that creating content for people earlier in the journey will likely mean lower conversion rates from visitor to customer, but my counter would be that you're otherwise missing out on people who could become customers. Ways to at least get these people into your funnel include offering content downloads (gated content) to capture users’ information, or remarketing activity via Facebook, Google Ads, or other retargeting platforms.

Moving from keywords to topics

I’m not going to bang this drum too loudly. I think many in the SEO community have signed up to the approach that topics are more important than keywords. There are quite a few resources on this online, but what drove it home for me was Cyrus Shepard’s Moz article in 2014. Much, if not all, of that post still holds true today.

What I will cover is an adoption of HubSpot’s Topic Cluster model. For those unaccustomed to their model, HubSpot’s approach formalizes and labels what many search marketers have been doing for a while now. The basic premise is instead of having your site fragmented with lots of content across multiple sections, all hyperlinking to each other, you create one really in-depth content piece that covers a topic area broadly (and covers shorter-tail keywords with high search volume), and then supplement this page with content targeting the long-tail, such as blog posts, FAQs, or opinion pieces. HubSpot calls this "pillar" and "cluster" content respectively.

Source: Matt Barby / HubSpot

The process then involves taking these cluster pages and linking back to the pillar page using keyword-rich anchor text. There’s nothing particularly new about this approach aside from formalizing it a bit more. Instead of structuring your site’s content so that it’s fragmented, with lots of different pages and topics interlinking, you keep the internal linking within its topic, or content cluster. This video explains the methodology further. While we accept this model may not fit every situation, and it’s not completely perfect, it’s a great way of understanding how search engines now interpret content.
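The pillar-and-cluster relationship can be expressed as a simple data structure, which also makes it easy to audit. The sketch below is illustrative only: the URLs and titles are invented, and this is not HubSpot's tooling, just the shape of the model.

```python
# Sketch of the pillar/cluster model: one broad pillar page, with
# cluster posts that should each link back to the pillar.
# URLs and titles are invented for illustration.

topic_cluster = {
    "pillar": {"url": "/facebook-marketing/", "title": "Facebook Marketing Guide"},
    "clusters": [
        {"url": "/blog/facebook-ads-budget/", "links_to_pillar": True},
        {"url": "/blog/facebook-video-tips/", "links_to_pillar": True},
    ],
}

def orphan_clusters(cluster_model):
    """Return cluster pages that forget to link back to the pillar."""
    return [c["url"] for c in cluster_model["clusters"] if not c["links_to_pillar"]]

print(orphan_clusters(topic_cluster))  # → []
```

A quick check like this, run against real crawl data, surfaces cluster posts that never got their keyword-rich link back to the pillar.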

At Aira, we’ve taken this approach and tried to evolve it a bit further, tying these topics into the stages of the buyer’s journey while utilizing several data points to make sure our outputs are based on as much data as we can get our hands on. Furthermore, because pillar pages tend to target shorter-tail keywords with high search volume, they're often either awareness- or consideration-stage content, and thus not applicable to the decision stage. We term our key decision pages “target pages,” as these should be a primary focus of any activity we conduct.

We’ll also look at the semantic relativity of the keywords reviewed, so that we have a “parent” keyword that we’re targeting a page to rank for, and then children of that keyword or phrase that the page may also rank for, due to its similarity to the parent. Every keyword is categorized according to its stage in the buyer’s journey and whether it's appropriate for a pillar, target, or cluster page. We also add two further classifications to our keywords: track & monitor and ignore. Definitions for these five keyword types are listed below:

Pillar page

A pillar page covers all aspects of a topic on a single page, with room for more in-depth reporting in more detailed cluster blog posts that hyperlink back to the pillar page. A keyword tagged with pillar page will be the primary topic and the focus of a page on the website. Pillar pages should be awareness- or consideration-stage content.

A great pillar page example I often refer to is HubSpot’s Facebook marketing guide or Mosi-guard’s insect bites guide (disclaimer: probably don’t click through if you don’t like close-up shots of insects!).

Cluster page

A cluster topic page for the pillar focuses on providing more detail for a specific long-tail keyword related to the main topic. This type of page is normally associated with a blog article but could be another type of content, like an FAQ page.

Good examples within the Facebook marketing topic listed above are HubSpot’s posts:

Mosi-guard doesn’t use internal links within the copy of its other blog posts, but the "older posts" section at the bottom of the blog references the guide:

Target page

Normally a keyword or phrase linked to a product or service page, e.g. nike trainers or seo services. Target pages are decision-stage content pieces.

HubSpot’s target content is their social media software page, with one of Mosi-guard’s target pages being their natural spray product.

Track & monitor

A keyword or phrase that is not the main focus of a page, but could still rank due to its similarity to the target page keyword. A good example of this might be seo services as the target page keyword, but this page could also rank for seo agency, seo company, etc.

Ignore

A keyword or phrase that has been reviewed but is not recommended for optimization, possibly due to a lack of search volume, excessive competition, poor profitability, etc.

Once the keyword research is complete, we then map our keywords to existing website pages. This gives us a list of mapped keywords and a list of unmapped keywords, which in turn creates a content gap analysis that often leads to a content plan that could last for three, six, or twelve-plus months.

Putting it into practice

I’m a firm believer in giving an example of how this would work in practice, so I’m going to walk through one with screenshots. I’ll also provide a template of our keyword research document for you to take away.

1. Harvesting keywords

The first step in the process is similar, if not identical, to every other keyword research project. You start off with a batch of keywords from the client or other stakeholders that the site wants to rank for. Most of the industry calls this a seed keyword list. That list is normally a minimum of 15–20 keywords, but can often be longer if you’re dealing with an e-commerce website with multiple product lines.

This list is often based on nothing more than opinion: “What do we think our potential customers will search for?” It’s a good starting point, but you need the rest of the process to make sure you’re optimizing based on data, not opinion.

2. Expanding the list

Once you’ve got that keyword list, it’s time to start utilizing some of the tools you have at your disposal. There are lots, of course! We tend to use a combination of Moz Keyword Explorer, Answer the Public, Keywords Everywhere, Google Search Console, Google Analytics, Google Ads, ranking tools, and SEMrush.

The idea of this step is to start thinking about keywords the organization may not have considered before. Your expanded list will also include obvious synonyms and variants of your seed terms. Take the example below:

| Seed Keywords | Expanded Keywords |
| --- | --- |
| ski chalet | ski chalet, ski chalet rental, ski chalet hire, ski chalet [location name], etc. |

There are other cases to consider, too. A client I worked with in the past gave a seed keyword of “biomass boilers,” but the keyword research revealed that a more colloquial term for these in the UK is “wood burners.” This is an important distinction and should be picked up as early in the process as possible. Keyword research tools are not infallible, so if budget and resources allow, you may wish to consult current and potential customers about which terms they might use to find the products or services being offered.
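The mechanical part of the expansion step can be scripted before the tools take over. Here is a minimal sketch of generating seed-plus-modifier combinations like the "ski chalet" table above; the modifier list is an illustrative assumption, not an exhaustive set.

```python
# Sketch: expand seed keywords with common modifiers.
# The modifier list is an illustrative assumption.

SEEDS = ["ski chalet", "catered chalet"]
MODIFIERS = ["rental", "hire", "[location name]"]

def expand(seeds, modifiers):
    expanded = []
    for seed in seeds:
        expanded.append(seed)                 # keep the seed itself
        for mod in modifiers:
            expanded.append(f"{seed} {mod}")  # seed + modifier
    return expanded

print(expand(SEEDS, MODIFIERS))
```

A scripted expansion like this only catches predictable variants; colloquial alternatives (the "wood burners" case) still need tools and human input.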

3. Filtering out irrelevant keywords

Once you’ve expanded the seed keyword list, it’s time to start filtering out irrelevant keywords. This is pretty labor-intensive and involves sorting through rows of data. We tend to use Moz’s Keyword Explorer, filter by relevancy, and work our way down. As we go, we’ll add keywords to lists within the platform and start to sort things by topic. Topics are fairly subjective, and you’ll often get overlap between them. We’ll group similar keywords and phrases together into a topic based on their semantic relativity. For example:

| Topic | Keywords |
| --- | --- |
| ski chalet | ski chalet, ski chalet rental, ski chalet hire, ski chalet [location name] |
| catered chalet | catered chalet, luxury catered chalet, catered chalet rental, catered chalet hire, catered chalet [location name] |
| ski accommodation | ski accommodation, cheap ski accommodation, budget ski accommodation, ski accommodation [location name] |

Many of the above keywords are decision-based keywords — particularly those with rental or hire in them. They're showing buying intent. We’ll then try to put ourselves in the mind of the buyer and come up with keywords towards the start of the buyer’s journey.

| Topic | Keywords | Buyer’s stage |
| --- | --- | --- |
| ski resorts | ski resorts, best ski resorts, ski resorts europe, ski resorts usa, ski resorts canada, top ski resorts, cheap ski resorts, luxury ski resorts | Consideration |
| skiing | skiing, skiing guide, skiing beginner’s guide | Consideration |
| family holidays | family holidays, family winter holidays, family trips | Awareness |

This helps us cater to customers that might not be in the frame of mind to purchase just yet — they're just doing research. It means we cast the net wider. Conversion rates for these keywords are unlikely to be high (at least, for purchases or enquiries) but if utilized as part of a wider marketing strategy, we should look to capture some form of information, primarily an email address, so we can send people relevant information via email or remarketing ads later down the line.
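A first pass at this topic bucketing can be automated by checking which topic label each keyword contains. This is only a rough sketch; real grouping by semantic relativity (in Keyword Explorer lists or a clustering script) is far more nuanced, and the topic labels here are just the examples from the tables above.

```python
# Sketch: bucket keywords into topics by whichever topic label the
# keyword contains. Anything that matches no topic needs manual review.

from collections import defaultdict

TOPICS = ["ski chalet", "catered chalet", "ski accommodation"]

def group_by_topic(keywords, topics):
    buckets = defaultdict(list)
    for kw in keywords:
        for topic in topics:
            if topic in kw:
                buckets[topic].append(kw)
                break
        else:
            buckets["unassigned"].append(kw)
    return dict(buckets)

kws = ["ski chalet rental", "catered chalet hire",
       "cheap ski accommodation", "ski resorts europe"]
print(group_by_topic(kws, TOPICS))
```

The "unassigned" bucket is where the judgment calls live: those are the keywords that either seed a new topic or get discarded as irrelevant.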

4. Pulling in data

Once you’ve expanded the seed keywords out, Keyword Explorer’s handy list function enables you to break things down into separate topics. You can then export that data into a CSV and start combining it with other data sources. If you have SEMrush API access, Dave Sottimano’s API Library is a great time saver; otherwise, you may want to consider uploading the keywords into the Keywords Everywhere Chrome extension, manually exporting the data, and combining everything together. You should then have a spreadsheet that looks something like this:

You could then add in additional data sources. There’s no reason you couldn’t combine the above with volume and competition metrics from other SEO tools. Consider including existing keyword ranking information or Google Ads data in this process. Keywords that convert well via PPC are likely to convert well organically too, and should therefore be prioritized. Wil Reynolds talks about this particular tactic a lot.
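Combining exports from two tools into one sheet is a small scripting job. Below is a sketch using Python's csv module; the file names and the "Keyword"/"Volume" column headings are assumptions about what your exports contain, so adjust them to match your actual tool output.

```python
# Sketch: merge keyword volumes from two tool exports into one CSV,
# one row per keyword. Column headings are assumed, not standardized.

import csv

def load_metric(path, metric_col):
    """Read a CSV of (Keyword, <metric>) into a dict keyed on keyword."""
    with open(path, newline="") as f:
        return {row["Keyword"].lower(): row[metric_col] for row in csv.DictReader(f)}

def combine(tool_a_path, tool_b_path, out_path):
    vol_a = load_metric(tool_a_path, "Volume")
    vol_b = load_metric(tool_b_path, "Volume")
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Keyword", "Volume (tool A)", "Volume (tool B)"])
        # union of both keyword sets, so nothing from either export is lost
        for kw in sorted(set(vol_a) | set(vol_b)):
            writer.writerow([kw, vol_a.get(kw, ""), vol_b.get(kw, "")])
```

Keeping both volume columns side by side, rather than picking one tool as the truth, makes the discrepancies between data sources visible in the final sheet.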

5. Aligning phrases to the buyer’s journey

The next stage of the process is to start categorizing the keywords into the stage of the buyer’s journey. Something we’ve found at Aira is that keywords don’t always fit into a predefined stage. Someone looking for “marketing services” could be doing research about what marketing services are, but they could also be looking for a provider. You may get keywords that could be either awareness/consideration or consideration/decision. Use your judgement, and remember this is subjective. Once complete, you should end up with some data that looks similar to this:

This categorization is important, as it starts to frame what type of content is most appropriate for that keyword or phrase.

The next stage of this process is to start noticing patterns in keyphrases and where they fall in the buyer’s journey. Often you’ll see keywords like “price” or ”cost” at the decision stage and phrases like “how to” at the awareness stage. Once you start identifying these patterns, possibly using a variation of Tom Casano’s keyword clustering approach, you can try to automate the tagging so that when these terms appear in your keyword column, the intent gets updated automatically.
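That pattern-based tagging can be sketched as a small rule-based classifier. The pattern lists below are assumptions seeded from the examples in this post ("price"/"cost" at decision, "how to" at awareness); you would extend them as you spot patterns in your own data, and manually review anything ambiguous.

```python
# Sketch: rule-based buyer's-journey tagging. Pattern lists are
# illustrative assumptions; unmatched keywords fall back to a default
# stage and should be reviewed by hand.

STAGE_PATTERNS = [
    ("decision", ["price", "cost", "buy", "hire", "rental"]),
    ("awareness", ["how to", "what is"]),
]

def classify_stage(keyword, default="consideration"):
    kw = keyword.lower()
    for stage, patterns in STAGE_PATTERNS:
        if any(p in kw for p in patterns):
            return stage
    return default

print(classify_stage("ski chalet hire"))   # decision
print(classify_stage("how to ski"))        # awareness
print(classify_stage("best ski resorts"))  # consideration (default)
```

As noted above, stages are subjective and some keywords straddle two of them, so treat the automated tag as a first pass rather than a final answer.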

Once completed, we can then start to define each of our keywords and give them a type:

  • Pillar page
  • Cluster page
  • Target page
  • Track & monitor
  • Ignore

We use this document to start thinking about what type of content is most effective for that piece given the search volume available, how competitive that term is, how profitable the keyword could be, and what stage the buyer might be at. We’re trying to find that sweet spot between having enough search volume, ensuring we can actually rank for that keyphrase (there’s no point in a small e-commerce startup trying to rank for “buy nike trainers”), and how important/profitable that phrase could be for the business. The below Venn diagram illustrates this nicely:
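One rough way to operationalize that sweet spot is a score that combines the three factors: volume, how realistically you can rank (the inverse of difficulty), and business value. The function below is an invented illustration, not our actual methodology; the normalization and the choice to multiply are assumptions, and real prioritization remains a judgment call.

```python
# Sketch: a crude "sweet spot" score over the three Venn factors.
# All inputs are assumed pre-normalized to the 0..1 range.

def sweet_spot_score(volume, difficulty, value):
    """Higher score = better keyword target."""
    rankability = 1.0 - difficulty
    # Multiply rather than average: a keyword failing badly on any one
    # dimension (e.g. unrankable) shouldn't score well overall.
    return round(volume * rankability * value, 3)

print(sweet_spot_score(volume=0.8, difficulty=0.9, value=0.9))  # huge term we can't rank for
print(sweet_spot_score(volume=0.4, difficulty=0.3, value=0.7))  # realistic target
```

The multiplication is the point of the sketch: it encodes the "buy nike trainers" caveat, since a startup's near-zero rankability zeroes out even a huge, valuable term.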

We also reorder the keywords so keywords that are semantically similar are bucketed together into parent and child keywords. This helps to inform our on-page recommendations:

From the example above, you can see "digital marketing agency" as the main keyword, but “digital marketing services” & “digital marketing agency uk” sit underneath.

We also use conditional formatting to help identify keyword page types:

And then sheets to separate topics out:

Once this is complete, we have a data-rich spreadsheet of keywords that we then work with clients on to make sure we’ve not missed anything. The document can get pretty big, particularly when you’re dealing with e-commerce websites that have thousands of products.

6. Keyword mapping and content gap analysis

We then map these keywords to existing content to ensure that the site hasn’t already written about the subject in the past. We often use Google Search Console data to do this so we understand how any existing content is being interpreted by the search engines. By doing this we’re creating our own content gap analysis. An example output can be seen below:

The above process takes our keyword research and then applies the usual on-page concepts (such as optimizing meta titles, URLs, descriptions, headings, etc) to existing pages. We’re also ensuring that we’re mapping our user intent and type of page (pillar, cluster, target, etc), which helps us decide what sort of content the piece should be (such as a blog post, webinar, e-book, etc). This process helps us understand what keywords and phrases the site is not already being found for, or is not targeted to.
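The core of the gap analysis is a simple split of researched keywords into mapped and unmapped. The sketch below illustrates the idea; the mapping dict stands in for your Search Console or crawl data, and the example keywords and URLs are invented.

```python
# Sketch: split keywords into "mapped" (an existing page already
# targets them) and "unmapped" (the content gap). The keyword-to-URL
# dict is a stand-in for Search Console / crawl data.

def content_gap(keywords, keyword_to_url):
    mapped, unmapped = {}, []
    for kw in keywords:
        url = keyword_to_url.get(kw)
        if url:
            mapped[kw] = url
        else:
            unmapped.append(kw)  # candidate for the content plan
    return mapped, unmapped

existing = {"ski chalet": "/chalets/", "catered chalet": "/chalets/catered/"}
mapped, gaps = content_gap(["ski chalet", "ski resorts", "catered chalet"], existing)
print(gaps)  # → ['ski resorts']
```

The unmapped list is what feeds the three-, six-, or twelve-month content plan mentioned earlier; the mapped side feeds the on-page optimization work instead.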

Free template

I promised a template Google Sheet earlier in this blog post and you can find that here.

Do you have any questions on this process? Ways to improve it? Feel free to post in the comments below or ping me over on Twitter!

