October 11, 2018

How to Create a Local Marketing Results Dashboard in Google Data Studio – Whiteboard Friday

Posted by DiTomaso

Showing clients that you're making them money is one of the most important things you can communicate to them, but it's tough to know how to present your results in a way they can easily understand. That's where Google Data Studio comes in. In this week's edition of Whiteboard Friday, our friend Dana DiTomaso shares how to create a client-friendly local marketing results dashboard in Google Data Studio from start to finish.


Video Transcription

Hi, Moz fans. My name is Dana DiTomaso. I'm President and partner of Kick Point. We're a digital marketing agency way up in the frozen north of Edmonton, Alberta. We work with a lot of local businesses, both in Edmonton and around the world, and small local businesses usually have the same questions when it comes to reporting.

Are we making money?

What I'm going to share with you today is our local marketing dashboard that we share with clients. We build this in Google Data Studio because we love Google Data Studio. If you haven't watched my Whiteboard Friday yet on how to do formulas in Google Data Studio, I recommend you hit Pause right now, go back and watch that, and then come back to this because I am going to talk about what happened there a little bit in this video.

The Google Data Studio dashboard

This is a Google Data Studio dashboard which I've tried to represent in the medium of whiteboard as best as I could. Picture it being a little bit better design than my left-handedness can represent on a whiteboard, but you get the idea. Every local business wants to know, "Are we making money?" This is the big thing that people care about, and really every business cares about making money. Even charities, for example: money is important obviously because that's what keeps the lights on, but there's also perhaps a mission that they have.

But they still want to know: Are people filling out our donation form? Are people contacting us? These are important things for every business, organization, not-for-profit, whatever to understand and know. What we've tried to do in this dashboard is really boil it down to the absolute basics, one thing you can look at, see a couple of data points, know whether things are good or things are bad.

Are people contacting you?

Let's start with this up here. The first thing is: Are people contacting you? Now you can break this out into separate columns. You can do phone calls and emails for example. Some of our clients prefer that. Some clients just want one mashed up number. So we'll take the number of calls that people are getting.

If you're using a call tracking tool, such as CallRail, you can import this in here. Emails, for example, or forms: just add it all together, and then you have one single number showing how many times people contacted you. Usually this is a way bigger number than people think it is, which is also kind of cool.
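If you build that single mashed-up number in Data Studio, it's typically a calculated field that sums the blended sources. A rough sketch, assuming hypothetical field names from a CallRail connector and form tracking (your connectors will name these differently):

```
SUM(Phone Calls) + SUM(Form Submissions) + SUM(Email Link Clicks)
```

You can then rename the result to something client-friendly, like "Times people contacted you."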

Are people taking the action you want them to take?

The next thing is: Are people doing the thing that you want them to do? This really depends on what's meaningful to the client.

For example, if you have a client, again thinking about a charity, how many people filled out your donation form, your online donation form? For a psychologist client of ours, how many people booked an appointment? For a client of ours who offers property management, how many people booked a viewing of a property? What is the thing you want them to do? If they have online e-commerce, for example, then maybe this is how many sales did you have.

Maybe this will be two different things: people walking into the store versus sales. We've also represented in this field that if a business has a people counter in their store, we would pull that people counter data in here. Usually we can get the people counter data into a Google Sheet and then pull it into Data Studio. It's not the prettiest thing in the world, but it certainly represents all their data in one place, which is really the whole point of why we do these dashboards.

Where did visitors come from, and where are your customers coming from?

People contacting you, people doing the thing you want them to do, those are the two major metrics. Then we do go a little bit deeper further down. On this side here we start with: Where did visitors come from, and where are your customers coming from? Because they're really two different things, right? Not every visitor to the website is going to become a customer. We all know that. No one has a 100% conversion rate, and if you do, you should just retire.

Filling out the dashboard

We really need to differentiate between the two. In this case we're looking at channel, and there probably is a better word for channel. We're always trying to think about, "What would clients call this?" But I feel like clients are kind of aware of the word "channel" and that's how they're getting there. But then the next column, by default this would be called users or sessions. Both of those are kind of cruddy. You can rename fields in Data Studio, and we can call this the number of people, for example, because that's what it is.

Then you would use the users as the metric, and you would just call it number of people instead of users, because personally I hate the word "users." It really boils down the humanity of a person to a user metric. Users are terrible. Call them people or visitors at least. Then unfortunately, in Data Studio, when you do a comparison field, you cannot rename and call it comparison. It does this nice percentage delta, which I hate.

It's just like a programmer clearly came up with this. But for now, we have to deal with it. Although by the time this video comes out, maybe it will be something better, and then I can go back and correct myself in the comments. But for now it's percentage delta. Then goal percentage and then again delta. They can sort by any of these columns in Data Studio, and it's real live data.

Put a time period on this, and people can pick whatever time period they want and then they can look at this data as much as they want, which is delightful. If you're not delivering great results, it may be a little terrifying for you, but really you shouldn't be hiding that anyway, right? Like if things aren't going well, be honest about it. That's another talk for another time. But start with this kind of chart. Then on the other side, are you showing up on Google Maps?

We use the Supermetrics Google My Business plug-in to grab this kind of information. We hook it into the customer's Google Maps account. Then we're looking at branded searches and unbranded searches and how many times they came up in the map pack. Usually we'll have a little explanation here. This is how many times you came up in the map pack and search results as well as Google Maps searches, because it's all mashed in together.

Then what happens when they find you? So number of direction requests, number of website visits, number of phone calls. Now the tricky thing is that phone calls here may already be captured in the phone calls number up top. You may not want to add these two pieces of data together, or you may want to keep this off on its own separately, depending on how your setup works. You could be using a tracking number, for example, in your Google My Business listing, and that would therefore be captured up there.

Really just try to be honest about where that data comes from instead of double counting. You don't want to have that happen. The last thing is if a client has messages set up, then you can pull that message information as well.

Tell your clients what to do

Then at the very bottom of the report we have a couple of columns, and usually this is a longer chart and this is shorter, so we have room down here to do this. Obviously, my drawing skills are not as good as aligning things in Data Studio, so forgive me.

But we tell them what to do. Usually when we work with local clients, they can't necessarily afford a monthly retainer to have us do this stuff for them forever. Instead, we tell them, "Here's what you have to do this month. Here's what you have to do next month. Hey, did you remember you're supposed to be blogging?" That sort of thing. Just put it in here, because clients are looking at results, but they often forget the things that may get them those results. This is a really nice reminder that if you're not happy with these numbers, maybe you should do these things.

Tell your clients how to use the report

Then the next thing is how to use the report. This is a good reference because if they only open it, say, once every couple of months, they've probably forgotten how to do the stuff in this report, or even simple things like setting the time period at the top. This is a good reminder of how to do that as well.

Because the report is totally editable by you at any time, you can always go in and change stuff later, and because the client can view the report at any time, they have a dashboard that is extremely useful to them and they don't need to bug you every single time they want to see a report. It saves you time and money. It saves them time and money. Everybody is happy. Everybody is saving money. I really recommend setting up a really simple dashboard like this for your clients, and I bet you they'll be impressed.

Thanks so much.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

October 10, 2018

Why We’re Doubling Down on the Future of SEO – Moz + STAT

Posted by Dr-Pete

Search is changing. As a 200-person search marketing software company, this isn't just a pithy intro – it's a daily threat to our survival. Being an organic search marketer can be frustrating when even a search like "What is SEO?" returns something like this...

...or this...

...or even this...

So, why don't we just give up on search marketing altogether? If I had to pick just one answer, it's this – because search still drives the lion's share of targeted, relevant traffic to business websites (and Google drives the vast majority of that traffic, at least in the US, Canada, Australia, and Western Europe).

We have to do everything better

The answer isn't to give up – it's to recognize all of this new complexity, study it, and do our jobs better. Earlier this year, for example, we embarked on a study to understand how SERP features impact click-through rates (CTR). It turns out to be a difficult problem, but even the initial insights of the data were useful (and a bit startling). For example, here's the average organic (SERPs with no features) curve from that study...

Various studies show the starting point at various places, but the shape itself is consistent and familiar. We know, though, that reducing everything to one average ignores a lot. Here's a dramatic example. Let's compare the organic curve to the curve for SERPs with expanded sitelinks (which are highly correlated with dominant and/or branded intent)...

Results with sitelinks in the #1 position have a massive 80% average CTR, with a steep drop to #2. These two curves represent two wildly different animals. Now, let's look at SERPs with Knowledge Cards (AKA "answer boxes" – Knowledge Graph entities with no organic link)...

The CTR in the #1 organic position drops to almost 1/3 of the organic-only curve, with corresponding drops throughout all positions. Organic opportunity on these SERPs is severely limited.

Opportunity isn't disappearing, but it is evolving. We have to do better. This is why Moz has teamed up with STAT, and why we're doubling down on search. We recognize the complexity of SERP analytics in 2018, but we also truly believe that there's real opportunity for those willing to do the hard work and build better tools.

Doubling down on RANKINGS

It hurts a bit to admit, but there's been more than once in the past couple of years where a client outgrew Moz for rank tracking. When they did, we had one thing to say to those clients: "We'll miss you, and you should talk to STAT Search Analytics." STAT has been a market leader in daily rank tracking, and they take that job very seriously, with true enterprise-scale capabilities and reporting.

For the past couple of years, STAT's team has also been a generous source of knowledge, and even as competitors our engineering teams have shared intel on Google's latest changes. As of now, all brakes are off, and we're going to dive deep into each other's brains (figuratively, of course – I only take mad science so far) to find out what each team does best. We're going to work to combine the best of STAT's daily tracking technology with Moz's proprietary metrics (such as Keyword Difficulty) to chart the future of rank tracking.

We'll also be working together to redefine what "ranking" means, in an organic sense. There are multiple SERP features, from Featured Snippets to Video Carousels to People Also Ask boxes that represent significant organic opportunity. STAT and Moz both have a long history of researching these opportunities and recognize the importance of reflecting them in our products.

Doubling down on RESEARCH

One area Moz has excelled at, showcased in the launch and evolution of Keyword Explorer, is keyword research. We'll be working hard to put that knowledge to work for STAT customers even as we evolve Moz's own toolsets. We're already doing work to better understand keyword intent and how it impacts keyword research – beyond semantically related keywords, how do you find the best keywords with local intent or targeted at the appropriate part of the sales funnel? In an age of answer engines, how do you find the best questions to target? Together, we hope to answer these questions in our products.

In August, we literally doubled our keyword corpus in Keyword Explorer to supercharge your keyword research. You can now tap into suggestions from 160 million keywords across the US, Canada, UK, and Australia.

Beyond keywords, Moz and STAT have both been market leaders in original industry research, and we'll be stronger together. We're going to have access to more data and more in-house experts, and we'll be putting that data to work for the search industry.

Doubling down on RESULTS

Finally, we recognize that SERP analytics are much more than just a number from 1–50. You need to understand how results drive clicks, traffic, and revenue. You need to understand your competitive landscape. You need to understand the entire ecosystem of keywords, links, and on-page SEO, and how those work together. By combining STAT's enterprise-level analytics with Moz's keyword research, link graph, and technical SEO tools (including both Site Crawl and On-demand Crawl), we're going to bring you the tools you need to demonstrate and drive bottom-line results.

In the short-term, we're going to be listening and learning from each other, and hopefully from you (both our community and our customers). What's missing in your search marketing workflow today? What data do you love in Moz or STAT that's missing from the other side? How can we help you do your job better? Let us know in the comments.

If you'd like to be notified of future developments, join our Moz+STAT Search Analytics mailing list (sign-up at bottom of page) to find out about news and offers as we roll them out.



October 10, 2018

Moz Acquires STAT Search Analytics: We’re Better Together!

Posted by SarahBird

We couldn't be more thrilled to announce that Moz has acquired STAT Search Analytics!

It’s not hard to figure out why, right? We both share a vision around creating search solutions that will change the industry. We're both passionate about investing in our customers’ success. Together we provide a massive breadth of high-quality, actionable data and insights for marketers. Combining Moz’s SEO research tools and local search expertise with STAT’s daily localized rankings and SERP analytics, we have the most robust organic search solution in the industry.

I recently sat down with my friend Rob Bucci, our new VP of Research & Development and most recently the CEO of STAT, to talk about how this came to be and what to expect next. Check it out:

You can also read Rob's thoughts on everything here over on the STAT blog!

With our powers combined...

Over the past few months, Moz’s data has gotten some serious upgrades. Notably, with the launch of our new link index in April, the data that feeds our tools is now 35x larger and 30x fresher than it was before. In August we doubled our keyword corpus and expanded our data for the UK, Canada, and Australia, positioning us to lead the market in keyword research and link building tools. Throughout 2018, we’ve made significant improvements to Moz Local’s UI with a brand-new dashboard, making sure our business listing accuracy tool is as usable as it is useful. Driving the blood, sweat, and tears behind these upgrades is a simple purpose: to provide our customers with the best SEO tools money can buy.

STAT is intimately acquainted with this level of customer obsession. Their team has created the best enterprise-level SERP analysis software on the market. More than just rank tracking, STAT’s data is a treasure trove of consumer research, competitive intel, and the deep search analytics that enable SEOs to level up their game.

Moz + STAT together provide a breadth and depth of data that hasn’t existed before in our industry. Organic search shifts from tactics to strategy when you have this level of insight at your disposal, and we can’t wait to reveal what industry-changing products we’ll build together.

Our shared values and vision

Aside from the technology powerhouse this partnership will build, we also couldn’t have found a better culture fit than STAT. With values like selflessness, ambition, and empathy, STAT embodies TAGFEE. Moz and STAT are elated to be coming together as a single company dedicated to developing the best organic search solutions for our customers while also fostering an awesome culture for our employees.

Innovation awaits!

To Moz and STAT customers: the future is bright. Expect more updates, more innovation, and more high-quality data at your disposal than ever before. As we grow together, you’ll grow with us.



October 9, 2018

The SEO Cyborg: How to Resonate with Users & Make Sense to Search Bots

Posted by alexis-sanders

SEO is about understanding how search bots and users react to an online experience. As search professionals, we’re required to bridge gaps between online experiences, search engine bots, and users. We need to know where to insert ourselves (or our teams) to ensure the best experience for both users and bots. In other words, we strive for experiences that resonate with humans and make sense to search engine bots.

This article seeks to answer the following questions:

  • How do we drive sustainable growth for our clients?
  • What are the building blocks of an organic search strategy?

What is the SEO cyborg?

A cyborg (or cybernetic organism) is defined as “a being with both organic and biomechatronic body parts, whose physical abilities are extended beyond normal human limitations by mechanical elements.”

With the ability to relate between humans, search bots, and our site experiences, the SEO cyborg is an SEO (or team) able to work seamlessly across both technical and content initiatives (whose skills are extended beyond normal human limitations) to drive organic search performance. An SEO cyborg is able to strategically pinpoint where to place organic search efforts to maximize performance.

So, how do we do this?

The SEO model

Like so many classic triads (think: primary colors, the Three Musketeers, Destiny’s Child [the canonical version, of course]) the traditional SEO model, known as the crawl-index-rank method, packages SEO into three distinct steps. At the same time, however, this model fails to capture the breadth of work that we SEOs are expected to do on a daily basis, and not having a functioning model can be limiting. We need to expand this model without reinventing the wheel.

The enhanced model involves adding in a rendering, signaling, and connection phase.

You might be wondering: why do we need these?

  • Rendering: There is increased prevalence of JavaScript, CSS, imagery, and personalization.
  • Signaling: HTML <link> tags, status codes, and even GSC signals are powerful indicators that tell search engines how to process and understand the page, determine its intent, and ultimately rank it. In the previous model, it didn’t feel as if these powerful elements really had a place.
  • Connecting: People are a critical component of search. The ultimate goal of search engines is to identify and rank content that resonates with people. In the previous model, “rank” felt cold, hierarchical, and indifferent towards the end user.

All of this brings us to the question: how do we find success in each stage of this model?

Note: When using this piece, I recommend skimming ahead and leveraging those sections of the enhanced model that are most applicable to your business’ current search program.

The enhanced SEO model

Crawling

Technical SEO starts with the search engine’s ability to find a site’s webpages (hopefully efficiently).

Finding pages

Initially finding pages can happen a few ways, via:

  • Links (internal or external)
  • Redirected pages
  • Sitemaps (XML, RSS 2.0, Atom 1.0, or .txt)

Side note: This information (although at first pretty straightforward) can be really useful. For example, if you’re seeing weird pages popping up in site crawls or performing in search, try checking:

  • Backlink reports
  • Internal links to URL
  • Redirected into URL

Obtaining resources

The second component of crawling relates to the ability to obtain resources (which later becomes critical for rendering a page’s experience).

This typically relates to two elements:

  1. Appropriate robots.txt declarations
  2. Proper HTTP status code (namely 200 HTTP status codes)
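A hedged sketch of what those robots.txt declarations might look like (the path and domain below are hypothetical; every site's rules will differ):

```
User-agent: *
# Keep bots out of internal search results (hypothetical path)
Disallow: /search/

Sitemap: https://www.example.com/sitemap.xml
```

Pair this with 200 status codes on the pages you want crawled, and avoid disallowing the CSS/JS resources bots need later for rendering.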

Crawl efficiency

Finally, there’s the idea of how efficiently a search engine bot can traverse your site’s most critical experiences.

Action items:

  • Is the site’s main navigation simple, clear, and useful?
  • Are there relevant on-page links?
  • Is internal linking clear and crawlable (i.e., <a href="/">)?
  • Is an HTML sitemap available?
    • Side note: Make sure to check the HTML sitemap’s next page flow (or behavior flow reports) to find where those users are going. This may help to inform the main navigation.
  • Do footer links contain tertiary content?
  • Are important pages close to root?
  • Are there no crawl traps?
  • Are there no orphan pages?
  • Are pages consolidated?
  • Do all pages have purpose?
  • Has duplicate content been resolved?
  • Have redirects been consolidated?
  • Are canonical tags on point?
  • Are parameters well defined?

Information architecture

The organization of information extends past the bots, requiring an in-depth understanding of how users engage with a site.

Some seed questions to begin research include:

  • What trends appear in search volume (by location, device)? What are common questions users have?
  • Which pages get the most traffic?
  • What are common user journeys?
  • What are users’ traffic behaviors and flow?
  • How do users leverage site features (e.g., internal site search)?

Rendering

Rendering a page relates to search engines’ ability to capture the page’s desired essence.

JavaScript

The big kahuna in the rendering section is JavaScript. For Google, rendering of JavaScript occurs during a second wave of indexing and the content is queued and rendered as resources become available.

Image based off of Google I/O ’18 presentation by Tom Greenway and John Mueller, Deliver search-friendly JavaScript-powered websites

As an SEO, it’s critical that we be able to answer the question — are search engines rendering my content?

Action items:

  • Are direct “quotes” from content indexed?
  • Is the site using <a href="/"> links (not onclick();)?
  • Is the same content being served to search engine bots (user-agent)?
  • Is the content present within the DOM?
  • What does Google’s Mobile-Friendly Testing Tool’s JavaScript console (click “view details”) say?

Infinite scroll and lazy loading

Another hot topic relating to JavaScript is infinite scroll (and lazy load for imagery). Since search engine bots are lazy users, they won’t scroll to attain content.

Action items:

Ask ourselves – should all of the content really be indexed? Is it content that provides value to users?

  • Infinite scroll: a user experience (and occasionally a performance optimizing) tactic to load content when the user hits a certain point in the UI; typically the content is exhaustive.

Solution one (updating AJAX):

1. Break out content into separate sections

  • Note: The breakout of pages can be /page-1, /page-2, etc.; however, it would be best to delineate meaningful divides (e.g., /voltron, /optimus-prime, etc.)

2. Implement History API (pushState(), replaceState()) to update URLs as a user scrolls (i.e., push/update the URL into the URL bar)

3. Add the <link> tag’s rel="next" and rel="prev" on relevant pages
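Solution one's History API updates can be sketched in a few lines of JavaScript. This is a hypothetical illustration, assuming fixed-height sections and made-up slugs, not a production implementation:

```javascript
// Map a scroll offset to the slug of the section currently in view.
// sectionHeight and the slug list are assumptions for this sketch.
function sectionForOffset(offset, sectionHeight, slugs) {
  const idx = Math.min(Math.floor(offset / sectionHeight), slugs.length - 1);
  return '/' + slugs[idx];
}

// Browser-only: keep the address bar in sync as the user scrolls.
if (typeof window !== 'undefined') {
  window.addEventListener('scroll', function () {
    const url = sectionForOffset(window.scrollY, 1000, ['voltron', 'optimus-prime']);
    // replaceState swaps the URL without adding a history entry on every tick
    if (window.location.pathname !== url) {
      window.history.replaceState({}, '', url);
    }
  });
}
```

Using replaceState avoids flooding the back button with scroll positions; swap in pushState if each section should be a discrete history entry.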

Solution two (create a view-all page)
Note: This is not recommended for large amounts of content.

1. If it’s possible (i.e., there’s not a ton of content within the infinite scroll), create one page encompassing all content

2. Site latency/page load should be considered

  • Lazy load imagery is a web performance optimization tactic in which images load as the user scrolls (the idea is to save time, downloading images only when they’re needed)
  • Add <img> tags in <noscript> tags
  • Use JSON-LD structured data
    • Schema.org "image" attributes nested in appropriate item types
    • Schema.org ImageObject item type
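The <noscript> fallback and JSON-LD ImageObject approaches above might look like this (file paths, URLs, and the lazyload class are placeholders, assuming a typical lazy-load library):

```html
<!-- Lazy-loaded image with a crawlable <noscript> fallback -->
<img data-src="/images/storefront.jpg" alt="Storefront" class="lazyload">
<noscript>
  <img src="/images/storefront.jpg" alt="Storefront">
</noscript>

<!-- JSON-LD ImageObject so the image is discoverable without rendering -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ImageObject",
  "contentUrl": "https://www.example.com/images/storefront.jpg",
  "name": "Storefront"
}
</script>
```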

CSS

I only have a few elements relating to the rendering of CSS.

Action items:

  • CSS background images aren’t picked up in image search, so don’t rely on them for important imagery
  • CSS animations aren’t interpreted, so make sure to add surrounding textual content
  • Page layouts matter (use responsive mobile layouts; avoid excessive ads)

Personalization

Although the broader digital world is trending toward 1:1, people-based marketing, Google doesn’t save cookies across sessions and thus won’t interpret personalization based on cookies, meaning there must be an average, base-user, default experience. The data from other digital channels can be exceptionally useful when building out audience segments and gaining a deeper understanding of the base user.

Action item:

  • Ensure there is a base-user, unauthenticated, default experience

Technology

Google’s rendering engine is leveraging Chrome 41. Canary (Chrome’s testing browser) is currently operating on Chrome 69. Using CanIUse.com, we can infer that this affects Google’s abilities relating to HTTP/2, service workers (think: PWAs), certain JavaScript, specific advanced image formats, resource hints, and new encoding methods. That said, this does not mean we shouldn’t progress our sites and experiences for users — we just must ensure that we use progressive enhancement (i.e., there’s a fallback for less advanced browsers [and Google too ☺]).

Action items:

  • Ensure there's a fallback for less advanced browsers

Indexing

Getting pages into Google’s databases is what indexing is all about. From what I’ve experienced, this process is straightforward for most sites.

Action items:

  • Ensure URLs are able to be crawled and rendered
  • Ensure nothing is preventing indexing (e.g., robots meta tag)
  • Submit sitemap in Google Search Console
  • Fetch as Google in Google Search Console

Signaling

A site should strive to send clear signals to search engines. Unnecessarily confusing search engines can significantly impact a site’s performance. Signaling relates to suggesting best representation and status of a page. All this means is that we’re ensuring the following elements are sending appropriate signals.

Action items:

  • <link> tag: This represents the relationship between documents in HTML.
    • Rel="canonical": This represents appreciably similar content.
      • Are canonicals a secondary solution to 301-redirecting experiences?
      • Are canonicals pointing to end-state URLs?
      • Is the content appreciably similar?
        • Since Google maintains prerogative over determining end-state URL, it’s important that the canonical tags represent duplicates (and/or duplicate content).
      • Are all canonicals in HTML?
      • Is there safeguarding against incorrect canonical tags?
    • Rel="next" and rel="prev": These represent a collective series and are not considered duplicate content, which means that all URLs can be indexed. That said, typically the first page in the chain is the most authoritative, so usually it will be the one to rank.
    • Rel="alternate"
      • media: typically used for separate mobile experiences.
      • hreflang: indicate appropriate language/country
        • The hreflang is quite unforgiving and it’s very easy to make errors.
        • Ensure the documentation is followed closely.
        • Check GSC International Target reports to ensure tags are populating.
  • HTTP status codes can also be signals, particularly the 304, 404, 410, and 503 status codes.
    • 304 – a valid page that simply hasn’t been modified
    • 404 – file not found
    • 410 – file not found (and it is gone, forever and always)
    • 503 – server maintenance

  • Google Search Console settings: Make sure the following reports are all sending clear signals. Occasionally Google decides to honor these signals.
    • International Targeting
    • URL Parameters
    • Data Highlighter
    • Remove URLs
    • Sitemaps
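Pulled together, the <link> signals above might look something like this in a page's <head> (all URLs are hypothetical):

```html
<head>
  <!-- Canonical: the end-state URL for appreciably similar content -->
  <link rel="canonical" href="https://www.example.com/page/">

  <!-- Paginated series -->
  <link rel="prev" href="https://www.example.com/page/?p=1">
  <link rel="next" href="https://www.example.com/page/?p=3">

  <!-- hreflang: language/country alternates (must be reciprocal across pages) -->
  <link rel="alternate" hreflang="en-us" href="https://www.example.com/page/">
  <link rel="alternate" hreflang="en-ca" href="https://www.example.com/ca/page/">
  <link rel="alternate" hreflang="x-default" href="https://www.example.com/page/">
</head>
```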

Rank

Rank relates to how search engines arrange web experiences, stacking them against each other to see who ends up on top for each individual query (taking into account numerous data points surrounding the query).

Two critical questions recur often when understanding ranking pages:

  • Does or could your page have the best response?
  • Are you or could you become semantically known (on the Internet and in the minds of users) for the topics? (i.e., are you worthy of receiving links and people traversing the web to land on your experience?)

On-page optimizations

These are the elements webmasters control. Off-page is a critical component to achieving success in search; however, in an ideal world, we shouldn’t have to worry about links and/or mentions – they should come naturally.

Action items:

  • Textual content:
    • Make content both people and bots can understand
    • Answer questions directly
    • Write short, logical, simple sentences
    • Ensure subjects are clear (not to be inferred)
    • Create scannable content (i.e., make sure <h#> tags are an outline, use bullets/lists, use tables, charts, and visuals to delineate content, etc.)
    • Define any uncommon vocabulary or link to a glossary
  • Multimedia (images, videos, engaging elements):
    • Use imagery, videos, engaging content where applicable
    • Ensure that image optimization best practices are followed
  • Meta elements (<title> tags, meta descriptions, OGP, Twitter cards, etc.)
  • Structured data
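On the structured data point, one common approach is JSON-LD using schema.org vocabulary. This is a minimal sketch with placeholder values (headline, author, and date are hypothetical, not from the article):

```python
import json

# Sketch: build schema.org Article structured data as JSON-LD.
# All values here are placeholders — swap in the page's real metadata.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example headline",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2018-10-11",
}

# This string would be embedded in a <script type="application/ld+json"> tag.
json_ld = json.dumps(article, indent=2)
print(json_ld)
```

Validating the output against Google's structured data testing tools before deploying helps keep this signal clear and aligned.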

Image courtesy of @abbynhamilton

  • Is content accessible?
    • Is there keyboard functionality?
    • Are there text alternatives for non-text media? Example:
      • Transcripts for audio
      • Images with alt text
      • In-text descriptions of visuals
    • Is there adequate color contrast?
    • Is text resizable?
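
The color contrast check above can be quantified. WCAG 2.x defines contrast as a ratio of relative luminances; this sketch implements that formula (AA body text requires at least 4.5:1):

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance for an (r, g, b) tuple in 0-255."""
    def channel(c):
        c /= 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors; ranges from 1:1 to 21:1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white is the maximum possible contrast, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

Running a check like this across a site's palette is a quick way to flag text/background pairs that fail accessibility thresholds.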

Finding interesting content

Researching and identifying useful content happens in three ways:

  • Keyword and search landscape research
  • On-site analytic deep dives
  • User research

Visual modified from @smrvl via @DannyProl

Audience research

When looking for audiences, we need to concentrate on high percentages (super high index rates are great, but not required). Push channels (particularly ones with strong targeting capabilities) do better with high index rates. This makes sense: for a pull channel we need to know that 80% of our customers have certain leanings (because we’re looking for the base case), not that five users over-index on a niche topic (those five niche-topic lovers are perfect for targeted ads).
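
The index rate mentioned above has a standard calculation: the segment's share of a trait divided by the baseline population's share, times 100. A quick sketch (the percentages are illustrative):

```python
def audience_index(segment_pct, baseline_pct):
    """Index rate: how much a segment over- or under-represents a trait
    versus the baseline population. 100 = average; 200 = twice as concentrated."""
    return round(segment_pct / baseline_pct * 100)

# 80% of our customers share a trait that 40% of the population has:
# a high concentration AND a high index.
print(audience_index(80, 40))  # 200
```

A niche trait can have a sky-high index but a tiny concentration, which is why push channels love it and broad content strategies care more about the 80% base case.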

Some seed research questions:

  • Who are users?
  • Where are they?
  • Why do they buy?
  • How do they buy?
  • What do they want?
  • Are they new or existing users?
  • What do they value?
  • What are their motivators?
  • What is their relationship w/ tech?
  • What do they do online?
  • Are users engaging with other brands?
    • Is there an opportunity for synergy?
  • What can we borrow from other channels?
    • Digital presents a wealth of data, in which 1:1, closed-loop, people-based marketing exists. Leverage any data you can get and find useful.

Content journey maps

All of this data can then go into creating a map of the user journey and overlaying relevant content. Below are a few types of mappings that are useful.

Illustrative user journey map

Sometimes when trying to process complex problems, it’s easier to break them down into smaller pieces. Illustrative user journeys can help with this! Take a single user’s journey and map it out, aligning relevant content experiences.

Funnel content mapping

This chart is deceptively simple; however, working through it can help sites understand how each stage in the funnel affects users (note: the stages can be modified). This matrix can help with mapping who writers are talking to, their needs, and how to push them to the next stage in the funnel.

Content matrix

Mapping out content by intent and branding helps to visualize conversion potential. I find these extremely useful for prioritizing top-converting content initiatives (i.e., start with ensuring branded, transactional content is delivering the best experience, then move towards more generic, higher-funnel terms).

Overviews

Regardless of how the data is broken down, it’s vital to have a high-level view of the audience’s core attributes, opportunities to improve content, and the strategy for closing the gap.

Connecting

Connecting is all about resonating with humans. Connecting is about understanding that customers are human (and we have certain constraints). Our mind is constantly filtering, managing, multitasking, processing, coordinating, organizing, and storing information. It is literally in our mind’s best interest to not remember 99% of the information and sensations that surround us (think of the lights, sounds, tangible objects, people surrounding you, and you’re still able to focus on reading the words on your screen — pretty incredible!).

To become psychologically sticky, we must:

  1. Get past the mind’s natural filter. A positive aspect of being a pull marketing channel is that individuals are already seeking out information, making it possible to intersect their user journey in a micro-moment.
  2. From there we must be memorable. The brain tends to hold onto what’s relevant, useful, or interesting. Luckily, the searcher’s interest is already piqued (even if they aren’t consciously aware of why they searched for a particular topic).

This means we have a unique opportunity to “be there” for people. This leads to a very simple, abstract philosophy: a great brand is like a great friend.

We have similar relationship stages, we interweave throughout each other’s lives, and we have the ability to impact happiness. This comes down to the question: Do your online customers use adjectives they would use for a friend to describe your brand?

Action items:

  • Is all content either relevant, useful, or interesting?
  • Does the content honor your user’s questions?
  • Does your brand have a personality that aligns with reality?
  • Are you treating users as you would a friend?
  • Do your users use friend-like adjectives to describe your brand and/or site?
  • Do the brand’s actions align with overarching goals?
  • Is your experience trust-inspiring?
  • Is the site served over HTTPS?
  • Are ads limited in the layout?
  • Does the site have proof of claims?
  • Does the site use relevant reviews and testimonials?
  • Is contact information available and easily findable?
  • Is relevant information intuitively available to users?
  • Is it as easy to buy/subscribe as it is to return/cancel?
  • Is integrity visible throughout the entire conversion process and experience?
  • Does the site have a credible reputation across the web?

Ultimately, being able to strategically, seamlessly create compelling user experiences which make sense to bots is what the SEO cyborg is all about. ☺

tl;dr

  • Ensure site = crawlable, renderable, and indexable
  • Ensure all signals = clear, aligned
  • Answer related, semantically salient questions
  • Research keywords, the search landscape, site performance, and develop audience segments
  • Use audience segments to map content and prioritize initiatives
  • Ensure content is relevant, useful, or interesting
  • Treat users as friends; be worthy of their trust

This article is based off of my MozCon talk (with a few slides from the Appendix pulled forward). The full deck is available on Slideshare, and the official videos can be purchased here. Please feel free to reach out with any questions in the comments below or via Twitter @AlexisKSanders.

