Yelp may once again feel Google is robbing it of its fair share of search traffic, with a study that’s been leaked to TechCrunch to prove it. A close read of that study suggests it arguably proves the opposite — or at least, that it’s certainly not as damning as it sounds.
Leaked Documents Show How Yelp Thinks It’s Getting Screwed By Google is the story over at TechCrunch out today with the study. I’ve taken my own headline from that to do this contrarian view.
Let me say from the start that Google does a lot of things that I think put it in conflict with publishers. How Google Went From Search Engine To Content Destination is a story I did about two years ago getting into this in more depth.
The story I’m writing today isn’t to say that Google is free of conflict between its role as a search engine and its role as a destination. Rather, I’m just looking at the study’s allegations in particular. If we’re going to damn Google for doing something wrong, and perhaps subject it to government regulation, let’s hope the evidence being used to lobby for such a move stands up.
In this case, it doesn’t.
Yelp seems to have wanted to determine how much traffic Google might be “siphoning” away from Yelp by somehow promoting Google+ Local in Google’s own listings. Yelp either conducted its own user behavior study or, more likely, contracted to have one done.
The study is summarized in these slides that TechCrunch obtained and shared on Scribd. Core to it is the contention that if people are explicitly looking for Yelp content on Google, Google is snagging them away.
To prove this, a single search is studied. One single search, for “gary danko yelp.” Gary Danko is a restaurant, and the assumption seems to be anyone who wants a Yelp review about the restaurant would type that phrase in (as opposed to, perhaps, “gary danko yelp review”).
The slides illustrate the culprit believed to be pulling users away — links to Google’s own content that appears below the listing for the restaurant:
The study recruited people online, using Amazon’s Mechanical Turk service, to participate. It wanted to use only people who knew of Yelp as a review site, so it screened everyone who responded. Anyone who didn’t choose Yelp as a review site from among a variety of possible answers was rejected. Of the total pool, 48% were rejected as unfamiliar with Yelp.
Those who remained were asked how they’d locate reviews of the Gary Danko restaurant, with their clicks measured, in the study run on June 21 and 22.
This leads to the first major flaw in the study. Participants weren’t given the choice of going directly to the Yelp website or perhaps using the Yelp app, as you might expect people who are familiar with Yelp to want to do.
Rather, they had only one choice — to search at Google, in order to determine what they’d click on there. From the study:
The study found that clicks on the search results went to Yelp content 80% of the time:
That’s the second big takeaway. It’s an odd argument to make that Google is somehow imposing so much harm against your business when, even for a search involving your name, it’s sending you 80% of the clicks. If Google’s uber-plan was to steal Yelp’s traffic, you’d think it would concoct a better way to do it.
How about the other clicks, the remaining 20%? The study says 19% of these went to the “Google+ Local Space” while 1% went to “Other.” That makes it sound, as the study concludes when rounding up, as if Google is somehow taking 20% of the clicks that rightfully belong to Yelp.
The reality is that Google actually took at most 3 clicks for itself in the study or 1.4% of the total clicks. It might have been less. To understand this, consider this close-up of the area the study calls the “Google+ Local Space,” shown below:
I made that screenshot from doing the same search the study did. The listing is the same as you can see from the study screenshot further above. The study counted any click in this area as a Google+ Local click, regardless of whether the clicks actually led to content on Google+, Google Maps or instead to the restaurant itself.
Now let’s look at the click map from the study, which recorded 38 clicks in this area:
The red dots should be the clicks. As you can see, virtually all of the clicks are on the listing for the restaurant itself. Despite this, the study somehow wants to count those as going to Google+.
Moreover, the Google+ links that come in below the restaurant’s listing, which the study seems to want to argue are key to somehow taking traffic from Yelp, drew only one click. No one actually clicked on the link for Google+ reviews. No one clicked on the link for the restaurant’s Google+ page. One single click happened on the “Write a review” link. That’s the extent of Google+ clicks generated by this listing.
Two other clicks went to neither Google+ nor the restaurant. One is in the upper right corner and appears to be someone clicking on nothing, though perhaps it was registering a click on an image a bit further to the right. The other was a click on a drop-down box that opens up for the words “Gary Danko” to provide a description of the site from the Open Directory.
The study also counted two clicks that happened in the Knowledge Graph box, to the right of the main search results, as being Google+ clicks:
As it turns out, these are clicks that might not have done anything. At the top, the click happens between the “Directions” and “Write a review” box. Maybe this was a misregister and one of those boxes was selected. Further down, what looks like a click to see Google+ reviews actually wouldn’t have made them appear, not if the person clicked on the stars as shown. Those won’t make the reviews load — try it yourself. But maybe, again, this was a misregister for a click on the reviews link to the right.
At most, I count three clicks to Google+ content, not the 40 the study suggests. That’s 1.4% of the total, not the 19% that it later rounds up to the 20% “siphonage” it says Yelp has suffered:
That figure is also used to suggest that this siphonage happens for a variety of navigational searches. A single study of one search is used to define the supposed stealing of traffic for all similar searches — and using an inflated number, at that.
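For those who want to check the arithmetic, here’s a quick sketch. It assumes, per the leaked slides, that the 40 clicks the study counted as “Google+ Local” (the 38 in the listing area plus the 2 in the Knowledge Graph box) represent its 19% figure, and that at most 3 of those actually went to Google+ content; the variable names are mine, not the study’s:

```python
# Clicks the study counted as going to Google+ Local:
# 38 in the local listing area + 2 in the Knowledge Graph box
google_counted = 38 + 2

# If those 40 clicks were 19% of all recorded clicks,
# the total pool works out to roughly 211 clicks.
total_clicks = round(google_counted / 0.19)

# The click map shows at most 3 of those 40 clicks
# actually landed on Google+ content.
actual_google_clicks = 3
actual_share = actual_google_clicks / total_clicks * 100

print(f"Total clicks implied by the study: {total_clicks}")
print(f"Recounted Google+ share: {actual_share:.1f}%")
```

Run that and the recounted share comes out to about 1.4%, an order of magnitude below the 19%-rounded-to-20% figure the slides present.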
Two other things from that concluding slide. First, despite all this supposed siphoning of traffic, the conclusion is that none of this has “negatively impacted our overall traffic.”
The slide also says that Google “appears to be intentionally serving up search results that contradict the users’ intent to preference Google+.” Actually, as I’ve explained, Google is putting the actual restaurant above Yelp, not Google+.
Still, is it doing this to “contradict” user intent “intentionally”? Sure, maybe Google went out of its way to hard-code something into its search engine to ensure that whenever someone searches for anything+yelp, it should put something else above that.
More likely, Google’s algorithm is weighing the words “Gary Danko” too heavily, giving them preference and returning the restaurant ahead of Yelp when the clicks suggest users want Yelp first.
That seems even more likely when you consider this:
Here, for a generic search on “gary danko reviews,” Google isn’t putting its own content at the top of the results, in the area that studies generally show pulls the most clicks. No, it’s Yelp that gets top billing, followed by TripAdvisor. Google’s own Zagat reviews site comes third. Google+ reviews is a small link tucked under the fourth listing.
How about searches for “gary danko” on its own? Look here:
The restaurant gets top billing. Google+ does get an associated link which, if Yelp’s study is any guide, might not actually get many clicks. Yelp gets a big second place listing.
As for someone who searches for “gary danko yelp reviews,” they get this:
In that example, Yelp gets the first four reviews.
If Google’s aim was to really siphon traffic away from Yelp, there are better ways it could be doing it, especially off of these types of searches.
It’s really important to understand that so far, Yelp hasn’t used this particular study to lob accusations at Google. It’s easy to slip into that assumption. In writing this, I kept having to correct myself from initially writing “Yelp says” to “the study says” and so on. Hopefully, I’ve caught all that. Perhaps Yelp will make use of this, however. If so, the study deserves much more scrutiny.
I’m also only working with a summary of the study, so perhaps there are details I’m missing, which might shape my analysis differently.
I’ll also stress again that it’s not that Google doesn’t have issues and conflicts between its role in being a search engine and its aspirations to be a content destination. It does. But this study isn’t showing some nefarious plot against Yelp. Rather, it suggests that despite everything, Google’s doing a pretty decent job of ensuring that Yelp gets plenty of free traffic despite Yelp being its competitor.
Finally, we’ve contacted both Google and Yelp about the study. If we get comments, we’ll follow up with a postscript or a fresh story.