September 27, 2018

Surprising SEO A/B Test Results – Whiteboard Friday

Posted by willcritchlow

You can make all the tweaks and changes in the world, but how do you know they're the best choice for the site you're working on? Without data to support your hypotheses, it's hard to say. In this week's edition of Whiteboard Friday, Will Critchlow explains a bit about what A/B testing for SEO entails and describes some of the surprising results he's seen that prove you can't always trust your instinct in our industry.


Video Transcription

Hi, everyone. Welcome to another British Whiteboard Friday. My name is Will Critchlow. I'm the founder and CEO at Distilled. At Distilled, one of the things that we've been working on recently is building an SEO A/B testing platform. It's called the ODN, the Optimization Delivery Network. We're now deployed on a bunch of big sites, and we've been running these SEO A/B tests for a little while. I want to tell you about some of the surprising results that we've seen.

What is SEO A/B testing?

We're going to link to some resources that will show you more about what SEO A/B testing is. But very quickly, the general principle is that you take a site section, so a bunch of pages that have a similar structure and layout and template and so forth, and you split those pages into control and variant, so a group of A pages and a group of B pages.

Then you make the change that you're hypothesizing is going to make a difference just to one of those groups of pages, and you leave the other set unchanged. Then, using your analytics data, you build a forecast of what would have happened to the variant pages if you hadn't made any changes to them, and you compare what actually happens to the forecast. Out of that you get some statistical confidence intervals, and you get to say, yes, this is an uplift, or there was no difference, or no, this hurt the performance of your site.
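As a rough illustration of the approach described above (this is a deliberately simplified sketch, not Distilled's actual ODN methodology), an SEO split test boils down to forecasting what the variant pages would have done unchanged and comparing actuals to that forecast. All numbers and names below are hypothetical:

```python
def ab_test_uplift(control_pre, variant_pre, control_post, variant_post):
    """Estimate the post-change uplift on the variant page group.

    The counterfactual here is deliberately simple: assume the variant
    group would have kept the same traffic ratio to the control group
    that it had before the change. Real platforms use richer forecasting
    models and report confidence intervals, not a point estimate.
    """
    # Pre-period ratio of variant traffic to control traffic.
    ratio = sum(variant_pre) / sum(control_pre)
    # Forecast what the variant pages "should" have received post-change.
    expected = sum(c * ratio for c in control_post)
    actual = sum(variant_post)
    return (actual - expected) / expected

# Hypothetical daily organic sessions for the two page groups.
control_pre  = [100, 110, 105, 95]
variant_pre  = [50, 55, 52, 48]     # variant historically ~50% of control
control_post = [102, 98, 100, 104]
variant_post = [60, 58, 62, 61]     # variant grew after the change

uplift = ab_test_uplift(control_pre, variant_pre, control_post, variant_post)
print(f"Estimated uplift: {uplift:+.1%}")
```

A negative value from the same calculation is how a test flags that a change hurt the variant pages.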

This is data that we've never really had in SEO before, because this is very different from running a controlled experiment in a lab environment or on a test domain. This is in the wild, on real, live websites. So let's get to the material. The first surprising result I want to talk about is based on some of the most basic advice that you've ever seen.

Result #1: Targeting higher-volume keywords can actually result in traffic drops

I've stood on stage and given this advice. I have recommended this stuff to clients. Probably you have too. You know that process where you do some keyword research and you find that there's one particular way of searching for whatever it is that you offer that has more search volume than the way that you're talking about it on your website right now, so higher search volume for a particular way of phrasing?

You make the recommendation, "Let's talk about this stuff on our website the way that people are searching for it. Let's put this kind of phrasing in our title and elsewhere on our pages." I've made those recommendations. You've probably made those recommendations. They don't always work. We've now tested this kind of process a few times, and we've seen what are actually dramatic drops.

We saw up to 20-plus-percent drops in organic traffic after updating meta information in titles and so forth to target the more commonly-searched-for variant. Various different reasons for this. Maybe you end up with a worse click-through rate from the search results. So maybe you rank where you used to, but get a worse click-through rate. Maybe you improve your ranking for the higher volume target term and you move up a little bit, but you move down for the other one and the new one is more competitive.

So yes, you've moved up a little bit, but you're still out of the running, and so it's a net loss. Or maybe you end up ranking for fewer variations of key phrases on these pages. However it happens, you can't be certain that just putting the higher-volume keyword phrasing on your pages is going to perform better. So that's surprising result number one. Surprising result number two is possibly not that surprising, but pretty important I think.
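The net-loss scenario described above is simple arithmetic: expected organic traffic is roughly search volume times click-through rate at your ranking position, so a higher-volume term at a still-uncompetitive position can bring in less than the original phrasing did. The CTR curve and volumes below are invented purely for illustration:

```python
# Hypothetical CTR-by-position curve (illustrative figures, not real data).
CTR_BY_POSITION = {1: 0.30, 2: 0.16, 3: 0.10, 5: 0.05, 8: 0.02}

def expected_traffic(rankings):
    """Sum of (monthly search volume * CTR at ranking position)."""
    return sum(volume * CTR_BY_POSITION[pos] for volume, pos in rankings)

# Before the change: ranking #2 for the lower-volume phrasing.
before = expected_traffic([(1_000, 2)])
# After: the 5x-volume term climbs but only to #8 (still out of the
# running), while the original phrasing slips from #2 to #5.
after = expected_traffic([(5_000, 8), (1_000, 5)])

print(before, after)  # the "after" total is lower despite the bigger volume
```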

Result #2: 30–40% of common tech audit recommendations make no difference

So this is that we see as many as 30% or 40% of the common recommendations in a classic tech audit make no difference. You do all of this work auditing the website. You follow SEO best practices. You find a thing that, in theory, makes the website better. You go and make the change. You test it.

Nothing. It flatlines. You get the same performance as the forecast, as if you had made no change. This is a big deal, because making these kinds of recommendations is what damages trust with engineers and product teams. You're constantly asking them to do stuff. They feel like it's pointless. They do all this stuff, and there's no difference. That is what burns authority with engineering teams too often.

This is one of the reasons we built the platform: we can take our 20 recommendations and hypotheses, test them all, find the 5 or 6 that move the needle, and only go to the engineering team to build those. That builds so much trust in the relationship over time, and they get to work on stuff that moves the needle on the product side.

So the big deal there is to be a bit skeptical about some of this stuff. The best practices, at the limit, probably make a difference. If everything else had been equal and you made that one tiny, little tweak to the alt attribute of a particular image somewhere deep on the page, maybe that would have made the difference.

But is it going to move you up in a competitive ranking environment? That's what we need to be skeptical about.

Result #3: Many lessons don't generalize

So surprising result number three is that many lessons do not generalize. We've seen this broadly across different site sections on the same website, and even across different industries. Some of this is about the competitive dynamics of the industry.

Some of it is probably just the complexity of the ranking algorithm these days. But we see this in particular with things like SEO text on a category page. Who's seen that? You've got all of your products, and then somebody says, "You know what? We need 200 or 250 words that mention our key phrase a bunch of times down at the bottom of the page." Sometimes, helpfully, your engineers will even put this in an SEO-text div for you.

So we see this pretty often, and we've tested removing it. We said, "You know what? No users are looking at this. We know that overstuffing keywords on a page can be a negative ranking signal. I wonder if we'll do better if we just cut that div." So we removed it, and the first time we did it, we saw a plus-6% result. This was a good thing.

The pages are better without it. They're now ranking better. We're getting better performance. So we say, "You know what? We've learnt this lesson. You should remove this really low-quality text from the bottom of your category pages." But then we tested it on another site, and we see there's a drop, a small one admittedly, but it was helping on these particular pages.

So I think what that's telling us is that we need to be testing these recommendations every time, and we need to build testing into our core methodologies. I think this trend is only going to continue and increase. The more complex the ranking algorithms get, and the more machine learning is baked in, the less deterministic it is than it used to be. And the more competitive the markets get, the narrower the gap between you and your competitors, the less stable all this stuff is, the smaller the differences will be, and the bigger the opportunity for something that works in one place to be null or negative in another.

So I hope I've inspired you to check out some SEO A/B testing. We're going to link to some of the resources that describe how you do it, how you can do it yourself, and how you can build a program around this, as well as some of our other case studies and lessons that we've learnt. But I hope you enjoyed this journey through surprising results from SEO A/B tests.

Resources:

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

September 26, 2018

The E-Commerce Benchmark KPI Study: The Most Valuable Online Consumer Trend of 2018 Revealed

Posted by Alan_Coleman

The latest Wolfgang E-Commerce Report is now live. This study gives a comprehensive view of the state of digital marketing in retail and travel, allowing digital marketers to benchmark their 2018 performance and plan their 2019 strategy.

The study analyzes over 250 million website sessions and more than €500 million in online revenue. Google Analytics, new Facebook Analytics reports, and online surveys are used to glean insights.

Revenue volume correlations

One of the unique features of the study is its conversion correlation. All website metrics featured in the study are correlated with conversion success to reveal what the most successful websites do differently.

This year we've uncovered our strongest success correlation ever at 0.67! Just to give that figure context: normally, 0.2 is worth talking about and 0.3 is noteworthy. Not only is this correlation with success very strong, the insight itself is highly actionable and can become a pillar of your digital marketing strategy.

These are the top factors that correlated with revenue volume. You can see the other correlations in the full study.


  • Average pages per session (.37)
  • Average session length (.49)
  • Conversion rate by users (.41)
  • Number of sessions per user (.67)
  • Percentage of sessions from paid search (.25)
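The coefficients above are correlations between each engagement metric and revenue across the sites in the study. For readers who want to reproduce the idea on their own data, a Pearson correlation can be computed from per-site figures like this (the numbers below are invented for illustration and are not the study's data):

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-website figures: sessions per user vs. annual revenue.
sessions_per_user = [1.2, 1.4, 1.5, 1.6, 1.9, 2.1]
revenue_millions  = [0.8, 1.1, 1.0, 1.6, 2.2, 2.4]

r = pearson(sessions_per_user, revenue_millions)
print(f"r = {r:.2f}")
```

Values near 0 mean no linear relationship; the study's headline finding is that sessions per user sits at 0.67, unusually high for this kind of cross-site comparison.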

Average website engagement metrics

                Sessions   Pages per   Avg. session   Bounce    Avg. page       Avg. server
                per user   session     duration       rate      load time (s)   response time (s)
Retail          1.58       6           3min 18sec     38.04%    6.84            1.02
Multi-channel   1.51       6           3min 17sec     35.27%    6.83            1.08
Online-only     1.52       5           3min 14sec     43.80%    6.84            0.89
Travel          1.57       3           2min 34sec     44.14%    6.76            0.94
Overall         1.58       5           3min 1sec      41.26%    6.80            0.97

Above are the average website engagement metrics. You can see the average number of sessions per user is very low, at about 1.5 over 12 months. Anything a digital marketer can do to raise that number to 2, 3, or 4 is about the most valuable digital marketing they can do.
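For concreteness, "sessions per user" is just total sessions divided by distinct users over the reporting window. A minimal sketch of that computation over raw session records (the log data and field names here are hypothetical, not an analytics API):

```python
from collections import Counter

def sessions_per_user(session_log):
    """Average number of sessions per distinct user over a period."""
    counts = Counter(user_id for user_id, _ in session_log)
    return sum(counts.values()) / len(counts)

# Hypothetical (user_id, session_date) records over 12 months.
log = [("u1", "2018-01-03"), ("u1", "2018-06-10"), ("u2", "2018-02-14"),
       ("u3", "2018-03-01"), ("u3", "2018-07-22"), ("u3", "2018-11-05")]

print(sessions_per_user(log))  # 6 sessions / 3 users = 2.0
```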

At Wolfgang Digital, we’ve been witnessing this phenomenon at a micro-level for some time now. Many of our most successful campaigns of late have been focused on presenting the user with an evolving message which matures with each interaction across multiple media touchpoints.

Click through to the Wolfgang E-Commerce KPI Report in full to uncover dozens more insights, including:

  • Is a social media engagement more valuable than a website visit?
  • What's the true value of a share?
  • What’s the average conversion rate for online-only vs multi-channel retailers?
  • What’s the average order value for a hotel vs. tour operator?

Video Transcript

Today I want to talk to you about the most important online consumer trend in 2018. The story starts in a client meeting about four years ago, when we were meeting with a travel client. We got into a discussion about bounce rate and its implications for conversion rate. The client was asking us, "Could we optimize our search and social campaigns to reduce bounce rate?", which is a perfectly valid question.

But we were wondering: Will we lower the rate of conversions? Are all bounces bad? As a result of this meeting, we said, "You know, we need a really scientific answer to that question about any of the website engagement metrics or any of the website channels and their influence on conversion." Out of that conversation, our E-Commerce KPI Report was born. We're now four years into it. (See previous years on the Moz Blog: 2015, 2016, 2017.)

The metric with the strongest correlation to conversions: Number of sessions per user

We've just released the 2019 E-Commerce KPI Report, and we have a standout finding, probably the strongest correlation we've ever seen between a website engagement metric and a website conversion metric. This is beautiful because we're all always optimizing for conversion metrics. But if you can isolate the engagement metrics which deliver, which are the money-making metrics, then you can be much more intelligent about how you create digital marketing campaigns.

The strongest correlation we've ever seen in this study is number of sessions per user, and the metric simply tells us, on average, how many times your users visited your website. What we're learning here is that any digital marketing you can do that makes that number increase is going to dramatically increase your conversions and your revenue success.

Change the focus of your campaigns

It's a beautiful metric to plan campaigns with because it changes the focus. We're not looking for a one-click-wonder campaign. We're not looking for a campaign that's one message delivered multiple times to the same user. Instead, we're trying to create a journey: multiple touchpoints which carry a user from their initial interaction through the purchase funnel, right through to conversion.

Create an itinerary of touchpoints along the searcher's journey

1. Research via Google

Let me give you an example. We started this with a story about a travel company. I'm just back from a swimming holiday in the west of Ireland. So let's say I have a fictional travel company. We'll call them Wolfgang Wild Swimming. I'm going to be a person who's researching a swimming holiday. So I'm going to go to Google first, and I'm going to search for swimming holidays in Ireland.

2. E-book download via remarketing

I'm going to go to the Wolfgang Wild Swimming web page, where I'm going to read a little bit about their offering. In doing that, I'm going to enter their Facebook audience. The next time I go to Facebook, they're now remarketing to me, and they'll be encouraging me to download their e-book, which is a guide to the best swimming spots in the wild west of Ireland. I'm going to volunteer my email to them to get access to the book. Then I'm going to spend a bit more time consuming their content and reading their book.

3. Email about a local offline event

A week later, I get an email from them, and they're having an event in my area. They're going for a swim in Dublin at The Forty Foot, for example, one of my local spots. I'm thinking, "Well, I was going to go for a swim this weekend anyway. I might as well go with this group." I go to the swim, where I can meet the tour guides and meet people who have been on the holiday before. I'm now really close to making a purchase.

4. YouTube video content consumed via remarketing

Again, a week later, they have my email address, so they're targeting me on YouTube with videos of previous holidays. Now I'm watching video content. All of a sudden, Wolfgang Wild Swimming comes up. I'm now watching a video of a previous holiday, and I'm recognizing the instructors and the participants in the previous holidays. I'm really, really close to pressing Purchase on a holiday here. I'm on the phone to my friend saying, "I found the one. Let's book this."

Each interaction moves the consumer closer to purchase

I hope what you're seeing there is with each interaction, the Google search, the Facebook ad which led to an e-book download, the offline event, back online to the YouTube video, with each interaction I'm getting closer to the purchase.

You can imagine the conversion rate and the return on ad spend on each interaction increasing as we go. This is a really powerful message for us as digital marketers. When we're planning a campaign, we think about ourselves as though we're in the travel business too, and we're actually creating an itinerary. We're simply trying to create an itinerary of touchpoints that guide a searcher through awareness, interest, right through to action and making that purchase.

I think it's not just our study that tells us this is true. In a lot of the best-performing campaigns we've been running, we've seen this anecdotally: every extra touchpoint increases the conversion rate. It's a really powerful insight, really useful for digital marketers when planning campaigns. This is just one of the many insights from our E-Commerce KPI Report. If you found that interesting, I'd urge you to go read the full report today.


