MetaFilter is a news and discussion site dating back to the early years of the web. Founded in 1999, it’s attracting attention this week after coming forward about how a Google penalty has severely harmed its business, to the degree of having to let staff go. It serves as a poster child of problems with Google’s penalty process, despite all the advances Google has made over the years.
If you just can’t wait to get to the specifics of what happened to MetaFilter, scroll further down in this story now. But I’d encourage you to go through the background explanation I’m going to cover. It’s important.
Stories like what’s happening with MetaFilter aren’t new. Google’s penalties have hit sites small and large for years. But often when those sites are hit, there’s something about them that doesn’t draw a great deal of sympathy.
You can dig (and I have) into some “small business” that claims to have done absolutely nothing wrong, only to discover it had been buying links or doing other things that many would agree were unsavory. Last week, I spent several hours looking into one such case that at first seemed all innocent but turned out to have layers and layers of garbage.
As for big businesses, after many called for Google to do something about “content farms,” it responded with its Panda Update in 2011. Many web commentators cheered that Demand Media properties were slaughtered by Panda rather than decried that Google was somehow monopolistic or too powerful. They were sick of content from Demand Media properties like eHow showing up, and good riddance seemed to be a widespread response.
MetaFilter is different. It serves as the elusive “poster child” of potential wrong-doing by Google, a venerable site that many internet old-timers remember fondly as a place to discover things before we had Twitter and Facebook and Reddit and BuzzFeed to barrage us with quizzes and LOLs and “you won’t believe what happened” articles and the like.
MetaFilter is in trouble with Google? To the degree it has to let people go! And because it may have had too many ads? Or has some type of unnatural linking?
Sure. MetaFilter has all the elements to be that poster child, that innocent that shouldn’t have been punished, starting with the most important concern. It can be damn hard to know what the hell has gone wrong with your site, if Google doesn’t like it.
It’s important to understand that if you’re in trouble with the Google law, you’ll effectively get a penalty — a Google traffic ticket — in one of two ways.
The first is what’s called a “manual action.” Google prefers to say “action” rather than what most in the SEO space call it: a manual penalty. That’s when an actual human being at Google has decided something is wrong with a page, several pages or even your entire site, and that page, group of pages or entire site will no longer rank as well as in the past, until the problem is fixed.
A few years ago, you might not have even realized you were given such a penalty. But these days, Google says it reports virtually all such manual actions, sending notices through its Google Webmaster Tools.
The chief challenge with manual actions these days, in terms of correcting them, is that Google sometimes fails to deliver enough information about what’s wrong. It’s like a cop who says, “Do you know why I pulled you over?” Unfortunately, this cop may leave you guessing about whether it’s a busted tail-light, or if you ran a stop sign or maybe that you were speeding. You just get a ticket with no guidance.
The second way you get penalized is through what’s called an algorithmic action (to use Google’s own terminology) or an algorithmic penalty. Google really doesn’t like to call these penalties, thinking of them instead as adjustments.
Adjustments? Yeah, really just ensuring that pages are being ranked where they should be ranked, rather than being somehow rewarded with a higher position they don’t deserve.
From Google’s point of view, it’s like this. Google has a basic algorithm, a software process that when you type in the word “travel” sorts through billions of pages and tries to figure out which are the best and order them from most relevant to least.
That algorithm uses lots and lots of different “signals” to figure all this out. It’s not just “PageRank,” as you may have heard. Our Periodic Table Of SEO Success Factors is a simplified version of the many signals — and even that lists plenty.
The problem is that the general algorithm can be tripped up by particular problems. That’s where all these “filters” or “updates” with animal names like “Panda” or “Penguin” come in. The algorithm is the basic engine that sorts through stuff; the filters are like parts added to the engine to catch particulates in the fuel, oil or air that the engine can’t handle so well.
To Google, these filters are all extensions of the core algorithm, extensions of a process that’s not meant to penalize anyone but rather rank things the way Google thinks they should be ranked. But the nature of how these filters get applied can certainly act and feel like a penalty, to the degree that they should properly be called that.
For example, take the Panda filter. Basically, Google “pours” all the sites it knows through the Panda filter in hopes of catching those it believes have “thin” content. Here’s a little clip of a presentation I did last year to illustrate this more:
Anything caught by that filter won’t do as well as it previously did.
To “escape” Panda, the site has to change the things that Panda doesn’t like. Then it has to wait until the next time Google applies the filter (which has tended to be monthly). If it changed enough, it may start ranking better. If it hasn’t changed enough, it stays trapped.
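To make the trap-and-release mechanics concrete, here’s a toy sketch — in no way Google’s actual code, with a made-up quality score and threshold — of how a periodically applied filter works:

```python
# Toy illustration of a periodically applied ranking filter: a run
# demotes any site whose (hypothetical) quality score falls below a
# threshold. Fixing your site changes nothing until the NEXT run.

THIN_THRESHOLD = 0.5  # hypothetical cutoff for "thin" content

def apply_filter(sites):
    """One filter run: re-evaluate every site and set its demoted flag."""
    for site in sites.values():
        site["demoted"] = site["quality"] < THIN_THRESHOLD

sites = {"example.com": {"quality": 0.3, "demoted": False}}

apply_filter(sites)                      # a run catches the site
assert sites["example.com"]["demoted"]

sites["example.com"]["quality"] = 0.8    # publisher bulks up thin pages...
assert sites["example.com"]["demoted"]   # ...but stays trapped between runs

apply_filter(sites)                      # next run: the demotion lifts
assert not sites["example.com"]["demoted"]
```

The key detail the sketch captures is that the demotion only lifts when the filter runs again — which is why sites can sit penalized for weeks after fixing everything.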
Google is also constantly changing its various filters, which may free up false positives or trap things that were missed before.
In short, if you’re hit by an automatic penalty, you need to fix what that penalty targets and then wait until the filter is applied again. That sounds easy, but it’s not. That’s because unlike with manual actions, Google doesn’t reveal which automatic penalty hit you.
This week provides a perfect example. If you suddenly experienced a drop in traffic this week, a really noticeable drop, then you might have been hit by the latest version of Panda that rolled out. Or maybe you were hit by the latest version of the Payday Update that rolled out. You’d better know which one, because each has its own fix.
Panda goes after “thin” content that’s deemed not to have enough substance to it. The solution is to often drop the thin content or bulk it up with real material. Payday goes after people who are deemed to be violating Google’s guidelines with outright spam, especially queries with a lot of spam activity, such as for “payday loans” or pornographic terms.
If you were hit this week by either, Google didn’t tell you. There was no notification in Google Webmaster Tools saying “Hello, you were hit by Panda — here’s what to do!” Savvy publishers up on SEO (who read us here) probably can guess they were hit by an update and may figure out what to do. Others just see a traffic plunge and have no idea why.
I’ve long lobbied that Google should tell people if they’re hit by one of these updates, and I’ll reiterate that today. You can’t fix things that are broken if you don’t know what’s wrong. Google should say when an important update happens (it doesn’t always reveal these) and let publishers know if they’re being hit by one of the major penalties.
At last, it’s time to dig into what happened with MetaFilter. The short story is that I don’t know. MetaFilter doesn’t know. Google knows, and so far, it’s not saying. But we can still do some analysis plus learn how difficult it is sometimes for a publisher to solve a Google penalty, if hit.
On Monday, MetaFilter’s founder Matt Haughey posted that in the fall of 2012, the site suffered a mysterious decline in Google traffic. He wrote:
A year and a half ago, we woke up one day to see a 40% decrease in revenue and traffic to Ask MetaFilter, likely the result of ongoing Google index updates. We scoured the web and took advice of reducing ads in the hopes traffic would improve but it never really did, staying steady for several months and then periodically decreasing by smaller amounts over time.
As a result, revenue dropped to the degree that the site is now running at a loss, which means layoffs for three moderators.
On Wednesday, Haughey posted more details. Some key points:
He also shared this chart of the traffic decline:
He sums up:
Since we’ve never seen a return to our pre-Fall 2012 traffic levels, I have to assume whatever hidden law we broke we’re still breaking, or that Google sees us as a home for comment spam even though we boot every single one we can find through a series of sophisticated methods, and the whole experience has been frustrating to say the least. At this point, I’m at wits end trying to figure out why our high-quality site, featuring good advice from a dedicated community of real people with a best-in-industry 24-hour moderation staff has seen such big decreases.
I bolded the key point, the key issue in all of this. Many things can be debated, but I think few would debate that people shouldn’t have to puzzle over why Google suddenly dislikes them. It should just say, even to the outright spammers (and it does, even to them, in cases of manual actions).
Now, I’ve been in touch briefly with Haughey and gotten the exact date of MetaFilter’s decline: November 17, 2012. I’m hoping to learn more from him, as well as from Google (it often won’t talk about specific sites), but that date alone is enough to raise concerns about how Google has approached this site.
I’d asked Haughey if he’d gotten any manual penalty notices, and he didn’t report any back to me, so this looks like an automatic penalty. Since it hit the Ask MetaFilter section, that’s the type of content that’s typically been slammed by Panda updates: help content that can often seem too “thin” to Google. But that date doesn’t line up precisely with any Panda updates. Nor does it line up with any other algorithm filter that we know went live near that time:
Penguin is a filter that goes after spam activity, in particular sites with bad links. MetaFilter’s traffic loss came well after one of the Penguin updates, so its problem really wasn’t links. If it had been a Penguin problem, MetaFilter would have seen a drop on October 5, 2012, or very close to that date.
Top Heavy is a filter that goes after sites that Google deems to be too top heavy with ads. But again, MetaFilter’s traffic loss came well after this happened. So while Haughey says he got advice online to pull back on ads, that was advice to fix a problem MetaFilter didn’t seem to have. Otherwise, it would have been hit by Top Heavy.
As for Panda, while that’s the likely candidate for the type of site MetaFilter is, its drop came between two Panda updates, aligned with neither. If a Panda update really had hit MetaFilter, the decline should have lined up much more closely with one of them. That’s how it’s supposed to work.
Possibly, MetaFilter was hit by some other algorithmic penalty that Google never publicly reported. Only Google would know that, or know if it really was some type of early or delayed Panda update (the Nov. 21, 2012 update might indeed have happened earlier, despite Google officially having denied that). And that’s the problem. If only Google knows, then a publisher like Haughey doesn’t know what to fix.
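This kind of alignment check is just date arithmetic. Here’s a quick sketch using the two update dates this article discusses (how close counts as “close enough” is a judgment call, not anything Google defines):

```python
from datetime import date

# Update dates mentioned in this piece, compared against MetaFilter's
# reported decline date of November 17, 2012.
known_updates = {
    "Penguin (Oct. 5, 2012)": date(2012, 10, 5),
    "Panda (Nov. 21, 2012)": date(2012, 11, 21),
}
drop = date(2012, 11, 17)  # MetaFilter's reported decline

for name, update_date in known_updates.items():
    gap = abs((drop - update_date).days)
    print(f"{name}: {gap} days from the drop")
# Penguin is 43 days away; the Nov. 21 Panda update is only 4 days away,
# but on the wrong side -- the drop came BEFORE that update.
```

That four-day, wrong-way-around gap is exactly why the question of whether the Nov. 21 update quietly rolled out early matters so much.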
As explained, MetaFilter doesn’t appear to have been hit by Top Heavy, so the pullback on ads (all ads from Google’s own AdSense program, by the way) might have been unnecessary.
If it was somehow Panda that hit, it is true that ads can have an impact there. But they’re a minor point of the advice Google has given, only one of the “23 Questions To Ask Yourself” if hit by Panda that Google gave out soon after Panda first appeared — and not mentioned at all in a Panda recovery advice video that Google put out last fall.
Nor does Ask MetaFilter seem as if it was that ad-heavy. Unfortunately, I can’t easily go back and find examples of how it looked. But I can compare a current page on the site now, with three ads, to an Internet Archive copy from November 2011, before the ad removals happened. Then, it had five ads — which, if they included link units, would have been within Google’s current policy. Only one of those ads was at the top of the page, hardly making it “top heavy” with ads. Nor, for the amount of content on that page, would five ads have seemed excessive.
Unless there’s something escaping me, from the evidence I can see from afar, the ads really shouldn’t have made a difference here.
One of the infuriating things that’s been happening over the past two years, as Google launched a renewed campaign against “unnatural links,” is the “link walk of shame” that it has been forcing people to do.
I can’t recall if I coined that term or heard it from someone else on a panel I was moderating, at one of our SMX conferences. But I love it, because it well describes how Google — whether it gives a manual or automatic penalty to a publisher over unnatural links — wants that publisher to try and manually scrub those links as much as possible from the web.
It’s insane because it has allowed the same sites that charged people to get links to now charge for them to be removed. Or for the rise of an entire link debuilding industry. Or for publishers who have long suffered terrible link requests to now get messages from people asking for links to be removed.
I went through this recently: someone had spammed a comment onto our site (it was still there, because of an issue with our commenting system) and wanted us to remove it. They’d paid to use a toolset that went around finding all their links and allowed them to effectively bulk-send such requests out to publishers. As I tweeted in frustration:
If you thought automated link requests were bad, automated link removal requests are worse. Because that's what I'm getting now. Sigh.
— Danny Sullivan (@dannysullivan) May 12, 2014
So when Haughey wrote of his dismay in getting similar requests, that resonates with me. I mean, look at this mess of requests from his inbox:
But worse, he was getting some of these requests because Google itself was telling other publishers they had links from his site that were bad. That’s not just generating annoying link removal emails for him; that’s also suggesting that Google doesn’t trust his site.
The truth is more likely that it has less to do with his site and more to do with Google seeing a pattern of what it considers unnatural linking by the publishers it is contacting. As Haughey guesses, MetaFilter is probably “collateral damage,” with the damage being the annoying link removal requests rather than any real distrust of the site. That’s especially likely since its ranking drop seems to well predate these requests.
But it is worrisome, especially when Haughey writes about how much care apparently goes into trying to ensure all links are relevant:
Google calling links found on MetaFilter “inorganic” is troubling. We have a staff of six full-time moderators in five timezones throughout the world (two are in Europe) to make sure zero spam ends up on the site.
We have a variety of internal tools that help us track all spam down: views of all activity by new users that contain links and lists of comments added to old questions that had a link in them (two patterns we found comment spammers trying in the past)….
We have a total of over ten million comments across all our sites combined and we spend so much time and energy tracking the few problem comments down that I would be hard-pressed to find even a single public comment that could be considered comment spam.
Every time I investigate these “unnatural link” claims, I find a comment by a longtime member of MetaFilter in good standing trying to help someone out, usually trying to identify something on Ask MetaFilter. In the course of explaining things, they’ll often do a search for examples of what they’re describing and include those for people asking a question.
Whatever was #1 in Google for “crawlspace vent covers” in a question of “How to reduce heating costs in the Winter?” might show up, and now years later, the owners of sites that actively gamed Google to get that #1 spot at the time are trying to clean up their act but unfortunately I have a feeling MetaFilter is suffering as collateral damage in the process.
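The two spam patterns Haughey describes — links from brand-new accounts, and links added to long-dormant questions — are the kind of thing a simple moderation heuristic can surface. Here’s a hypothetical sketch (the function name, 30-day and one-year windows are all my invention, not MetaFilter’s actual tooling):

```python
import re
from datetime import date

URL_RE = re.compile(r"https?://\S+")

def flag_for_review(comment_text, thread_date, user_signup_date, today):
    """Flag link-bearing comments from brand-new accounts, or links
    added to long-dormant threads, for human review. The windows are
    made up for illustration."""
    if not URL_RE.search(comment_text):
        return False  # no link, no flag
    new_user = (today - user_signup_date).days < 30
    stale_thread = (today - thread_date).days > 365
    return new_user or stale_thread

today = date(2014, 5, 21)
# A days-old account dropping a link into a two-year-old question: flagged.
print(flag_for_review("Try http://example.com/widgets",
                      date(2012, 3, 1), date(2014, 5, 10), today))  # True
```

Note this only queues comments for human eyes — which matches Haughey’s point that the final call is made by a round-the-clock moderation staff, not an algorithm.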
Google has a link disavow tool that allows anyone who has “unnatural links” to effectively tell Google not to count them. Google should just let people use that, without causing innocent publishers to get caught up in these link walks of shame. Better, if Google is smart enough to know a link isn’t deemed “natural,” then just don’t count it.
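For reference, the disavow file Google accepts is just plain text: one entry per line, either a full URL or a whole domain prefixed with `domain:`, with `#` starting a comment. A minimal example (the domains here are made up):

```
# Links we didn't create and couldn't get removed
domain:paid-link-network.example
http://spammy-directory.example/listing?id=123
```

Uploading a file like this through Webmaster Tools tells Google to ignore those links — no emails to innocent third parties required.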
I’m not a regular frequenter of MetaFilter. For all I know, the site might actually be spam-riddled in ways that are hard for a newcomer to see. But that’s not the take I get from David Auerbach over at Slate, who is more familiar with it and went through a number of pages to come to this conclusion:
MetaFilter is one of the best-moderated sites on the Internet, and it’s tragic if it’s losing revenue to mistaken deranking, when linkbait and song-lyric sites frequently clog up the higher rankings.
It feels like MetaFilter was caught up in a Panda filter, despite the oddity of being slammed between updates rather than alongside one. It also feels like a false positive, a site that shouldn’t have been hit that way. Here’s hoping that Google takes a fast, closer look at the situation.
Clearly it is:
@ElliotJH MattH and I have been discussing it over the last week or so.
— Matt Cutts (@mattcutts) May 21, 2014
That tweet is from the head of Google’s web spam team Matt Cutts, who notes that he’s been discussing the issue with Haughey “over the last week or so.” That means Haughey has been getting some advice about the situation from as far up as you can go, when it comes to Google penalties — and before he went public with the issue.
It’s also not something Haughey has shared in either of his posts. Knowing exactly what he was told by Cutts would be helpful to outsiders trying to figure out if MetaFilter was unfairly hit.
For other publishers who aren’t MetaFilter, who aren’t going to get this type of attention, the situation is less positive. They might flounder along without any outcry. But these general things would be a big help:
As for publishers worried about Google penalties, here’s the most important advice for protecting yourself:
If you’re an essential resource or brand (big or small), Google will come under fire by its users if they can’t find you. Or, your loyal visitors will speak up, if they feel Google isn’t treating you well.
MetaFilter isn’t getting special attention because of its Google problem. Google is getting attention because of its MetaFilter problem — the problem being that Google might be mistreating a well-known site that’s earned respect over time.
Earn respect. That’s your best defense if things go south with Google. It’s also your best offense for doing well in Google.
I’ll conclude by saying that it sucks when people are hit by unfair penalties. It’s also bad when people are hit by penalties and left uncertain about what even went wrong. If Google’s going to give out tickets, it should tell people exactly how to fix things.
But I’ll also say there is a lot of garbage people should be thankful that Google keeps out of its search results, not just “thin” pages but pages full of malware or stuff that is actively trying to mislead people in ways that many would agree is terrible.
It’s incredibly easy to focus on Google’s mistakes, to pick the poster child and assume that it represents how everything Google does is wrong. It’s easy to pick a small site that’s been hit and say this is Google going after the little business. It’s easy to rally to a banner of “Google has too much power,” especially as the once small and seemingly well-meaning company has evolved into a sometimes ruthless competitor.
But I’m one for perspective. The reality is that for all the evil people sometimes want to portray Google as being, this is a search engine that still sends huge numbers of people away from its site, rather than trying to just recirculate them within Google properties. Many businesses have gained, grown and thrived off Google traffic without ever paying for it. And for all the poster children of wrong-doing that get attention, there are many more sites that have benefited that you don’t hear about, because no one complains about things going well.
That doesn’t excuse Google for getting things wrong. That doesn’t mean it’s correct about whatever the case is with MetaFilter or other sites it has issues with. That doesn’t mean the company doesn’t have much more it could do to improve. But perspective is useful, because making such decisions as a search engine is hard. Really hard — and if you want to better understand, I recommend this article and video: Could You Walk In Google’s Shoes? Making Tough Calls With Search Listings.
And if you haven’t yet read enough, here are a billion background articles with more:
Postscript: The “Then Again, MetaFilter Has Gotten Some Unmentioned Help From Google” section of this was added a few hours after the original post, when I saw Cutts make his tweet. I hadn’t seen that (and had looked) when writing the original piece.