Posted by Dom-Woodman
My current obsession has been reporting. Everyone could benefit from paying more attention to it. Five years, countless ciders, and too many conferences into my career, I finally spent some time on it.
Bad reporting soaks up just as much time as pointless meetings. Analysts spend hours creating reports that no one will read, or building dashboards that never get looked at. Bad reporting means people either focus on the wrong goals, or they pick the right goals but choose the wrong way to measure them. Either way, you end up in the same place.
So I thought I'd share what I’ve learned.
We’re going to split this into:
(We’ll lean on SEO examples — we’re on Moz! — however, for those non-SEO folks, the principles are the same.)
The action you take off a dashboard should be:
Example questions a dashboard would answer:
The action you take off a report should be:
Example questions a report would answer:
This context will inform many of our decisions. Audience matters, because different audiences know and care about very different things.
A C-level executive doesn’t care about keyword cannibalization, but probably does care about the overall performance of marketing. An SEO manager, on the other hand, probably does care about the number of pages indexed and keyword cannibalization, but is less bothered by the overall performance of marketing.
If someone tells you the report is for audiences with obviously different decision levels, then you’re almost always going to end up creating something that won’t fulfill the goals we talked about above. Split up your reporting into individual reports/dashboards for each audience, or it will be left ignored and unloved.
How do you know what your audience will care about? Ask them. As a rough guide, you can assume people typically care about:
But seriously. Ask them what they care about.
Asking them is particularly important, because you don't just need to understand your audience; you may also need to educate them. To contradict what I said earlier, there are in fact CEOs who will care about specific keywords.
The problem is, they shouldn’t. And if you can’t convince them to stop caring about that metric, their incentives will be wrong and succeeding in search will be harder. So ask. Persuading them to stop using the wrong metrics is, of course, another article in and of itself.
To continue that point, now is also the time to get initial agreement that these dashboards/reports will be what’s used to measure performance.
That way, when they email you three months in asking how you’re doing for keyword x, you’re covered.
The question you’re answering with a dashboard is usually quite simple. It's often some version of:
...where x is a general goal, not a metric. The difference here is that a goal is the end result (e.g. a fast website), and the metric (e.g. time to start render) is the way of measuring progress against that.
This is the hard part. We’re defining our goal by the metrics we choose to measure it by.
A good metric is typically a direct measure of success. It should ideally have no caveats that are outside your control.
No caveats? Ask yourself how you would explain the number going down. If you can immediately come up with excuses rooted in things outside your control, then you should refine the metric. (Don't worry, there's an example in the next section.)
We also need to remember that any metric we choose will create incentives for how people behave.
Unlike a report, which will be used to help us make a decision, a dashboard is showing the goals we care about. It’s a subtle distinction, but an important one. A report will help you make a single decision. A dashboard and the KPIs it shows will define the decisions and reports you create and the ideas people have. It will set incentives and change how the people working off it behave. Choose carefully. Avinash has my back here; go read his excellent article on choosing KPIs.
You need to bear both of these in mind when choosing metrics. You typically want only one or two metrics per goal to avoid being overwhelming.
Goal: Measure the success of organic performance
Who is it for: SEO manager
The goal we’re measuring and the target audience are sane, so now we need to pick a metric.
We’ll start with a common metric that I often hear suggested and we’ll iterate on it until we’re happy. Our starting place is:
You might have to compromise your metric depending on resources. What we’ve just talked through is an ideal. Adjusting for industry, for example, is typically quite hard; you might have to settle for showing Google trends for some popular terms on a second graph, or showing Hitwise industry data on another graph.
Watch out if you find yourself adding more than one or two additional metrics. When you get to three or four, information becomes difficult to parse at a glance.
What about incentives? The metric we settled on will incentivize our team to get more traffic, but it has no quality control.
We could succeed at our goal by aiming for low-quality traffic, which doesn’t convert or care about our brand. We should consider adding a second metric, perhaps revenue attributed to search with linear attribution, smoothed seasonality, and a 90-day lookback. Or alternatively, organic non-bounce sessions with smoothed seasonality (using adjusted bounce rate).
Both of those metrics are a bit of a mouthful. That's because they've been through a process similar to the one we just walked through. We might've started with revenue attributed to search, then got more specific and ended up with revenue attributed to search with linear attribution, smoothed seasonality, and a 90-day lookback.
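To make "smoothed seasonality" a little more concrete, here's a minimal sketch (my own illustration, not taken from any particular analytics tool) of one common approach: a trailing moving average over daily figures. The 28-day window is an assumption; indexing against the same period last year is another reasonable option.

```python
# Illustrative sketch: smooth out short-term seasonality with a trailing
# moving average. The 28-day default is an assumption; pick whatever cycle
# (weekly, monthly) dominates your data.
def smooth(values, window=28):
    """Return a trailing moving average: each point averages the up-to-
    `window` values ending at that point."""
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1) : i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```

Plotting the smoothed series next to the raw one makes it much easier to eyeball whether a dip is a trend or just a weekend.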
Remember, a dashboard shouldn't try to explain why performance was bad (at least not for things in your control). A dashboard's job is to track a goal over time and say whether or not further investigation is needed.
The goal here is to convey our information as quickly and easily as possible. It should be eyeball-able.
Creating a good dashboard layout:
Here’s a really basic example I mocked up for this post, based on the section above:
A report needs to be able to help us make a decision. Picking the goal for a dashboard is typically quite simple. Choosing the decision our report is helping us make is usually a little more fraught. Most importantly, we need to decide:
If you don’t have a decision in mind, if you’re just creating a report to dig into things, then you’re wasting time. Don’t make a report.
If the decision is to prioritize next month, then you could have an investigative report designed to help you prioritize. But the goal of the report isn’t to dig in — it's to help you make a decision. This is primarily a frame of mind, but I think it’s a crucial one.
Once we’ve settled on the decision, we then:
Here’s an example decision a client suggested to me recently:
Are we happy with this decision? In this case, I wasn't. Experience has taught me that SEO very rarely runs week to week; one thing our SEO split-testing platform has taught us time and time again is that even obvious improvements can take three to four weeks to produce a significant traffic change.
Great — we’re now happy with our decision, so let’s start listing possible factors. For the sake of brevity, I’m only going to include three here:
1. Individual keyword rankings
Conclusion: Yes, we should include keyword rankings, but they need to be grouped, and ideally we should have rank both with and without Google features. We'll also want to avoid averaging rank, which loses the subtlety of how our keywords are moving amongst each other. This example graph from STAT illustrates this well:
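If you want to roll your own version of that grouping idea, here's a hypothetical sketch (the band boundaries and names are my own assumptions, not STAT's): bucket each keyword's rank into bands instead of averaging, so movement within the group stays visible.

```python
# Illustrative sketch: count keywords per ranking band rather than
# averaging ranks, which hides movement within the group.
def rank_distribution(ranks):
    """Bucket a list of keyword ranks into coarse ranking bands."""
    bands = {"1-3": 0, "4-10": 0, "11-20": 0, "21+": 0}
    for rank in ranks:
        if rank <= 3:
            bands["1-3"] += 1
        elif rank <= 10:
            bands["4-10"] += 1
        elif rank <= 20:
            bands["11-20"] += 1
        else:
            bands["21+"] += 1
    return bands
```

Tracking these band counts over time (say, as a stacked area chart) shows keywords shuffling between page one and page two even when the average rank barely moves.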
2. Individual keyword clicks
Conclusion: I would probably say no. We’re only looking at organic performance here and clicks will be subject to seasonality and industry trends that aren’t related to our organic performance. There are certainly click metrics that will be useful that we haven’t gone over in these examples — this just isn’t one of them.
3. Number of indexed pages
Conclusion: Probably yes. The automation will be a pain, but it will be relatively easy to pull it in manually once a month. It won’t change anyone's mind very often, so it won’t be put at the forefront of a report, but it’s a useful additional piece of information that’s very quick to scan and will help us rule something out.
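Since indexed-page counts mostly serve to rule things out, a monthly snapshot plus a simple change check is usually enough. Here's a hypothetical sketch (the 10% threshold and function name are my own assumptions):

```python
# Illustrative sketch: flag when the latest monthly indexed-page count
# moves more than `threshold` (proportionally) versus the previous month.
def index_count_changed(history, threshold=0.10):
    """`history` is a chronological list of monthly indexed-page counts.
    Returns True when the latest month moved more than `threshold`
    relative to the month before."""
    if len(history) < 2:
        return False  # nothing to compare against yet
    previous, latest = history[-2], history[-1]
    return abs(latest - previous) / previous > threshold
```

If it returns False month after month, the number can sit quietly at the back of the report; a True is the cue to dig in.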
Again, our layout should be fit for the goal we're trying to achieve, which gives us a couple of principles to follow:
The graphs themselves are crucial elements of a report and dashboard. People have built entire careers out of helping people visualize data on graphs. Rather than reinvent the wheel, the following resources have all helped me avoid the worst when it comes to graphs.
Neither #1 nor #2 below focuses on making things pretty; both focus on the goal of a graph: letting you process data as quickly as possible.
Sometimes (read: nearly always) you’ll be limited by the programs you work in, but it’s good to know the ideal, even if you can’t quite reach it.
Well, we got to the end of the article and I’ve barely even touched on how to practically make dashboards/reports. Where are the screenshots of the Google Data Studio menus and the step-by-step walkthroughs? Where’s the list of tools? Where’s the explanation on how to use a Google Sheet as a temporary database?
Those are all great questions, but it’s not where the problem lies.
We need to spend more time thinking about the content of reports and what they're being used for. It's possible that, having read this article, you'll come away determined to make fewer reports and to trash a whole bunch of your dashboards.
That’s fantastic. Mission accomplished.
There are good tools out there (I quite like Plot.ly and Google Data Studio) which make generating graphs easier, but the problem with many of the dashboards and reports I see isn't that they've used the Excel default colors; it's that no one has spent enough time thinking about the decision the report supports, or picking the ideal metric for the dashboard.
Let’s go out and think more about our reports and dashboards before we even begin making them.
What do you guys think? Has this been other people's experience? What are the best/worst reports and dashboards you’ve seen and why?