Meet The Search Engines: Up Close @SMX West 2015


The themes that kept arising at SMX West 2015 last week were mobile and security. So, going into the “Meet The Search Engines” panel with Bing’s Duane Forrester and Google’s Gary Illyes, we knew what to expect. Or so we thought.

Search Engine Land founding editor Danny Sullivan, who served as moderator and SMX emcee, kept the conversation light and entertaining as always, but unlike many such sessions in the past, we actually learned some interesting things from Gary and Duane.

Signals & Quality

It was interesting that Duane was the one to talk about “signals” and “multiple layers of signals” rather than the Google rep. He mentioned that Bing looks at a domain holistically and that it understands some sites have thin or throwaway content (such as event pages for events that have passed or products that are out of stock) by nature.

He said Bing looks at signals from the domain overall in these cases and that if there is enough good content and/or strong quality signals to the domain overall, that will often override the data about the thin or throwaway content. I always suspected this was the case, but it was nice to have it confirmed.

Gary declined to comment on how Google handles situations like this, but nodded his head in agreement. Whether that nod means this is the way it should be done or that this is the way Google does it… only Google knows for sure.

Out Of Stock Or Unavailable Products

Another interesting point of contention was how sites should handle those throwaway pages.

If a product goes out of stock and may come back, most site owners will include a note on the page or 302 it to another page (more on that 302 in a moment).

If the product is gone and not coming back, both Bing and Google recommend that you 404 it (more on that in a moment, as well). Where the engines differed dramatically, however, was in how they suggested you inform the search engine about it.

XML Sitemaps – Signals Or Definitive?

Duane said you should immediately drop those 404 pages from your XML Sitemaps. He said “Bing wants your sitemaps to be clean.” In the past, he’s been quoted as saying that Bing will actually ignore sitemaps that aren’t clean; that they “lose trust in them.”

Gary, on the other hand, said that Google would like you to leave those URLs in your XML sitemaps until they are dropped from the index, and only then remove them. Since this represents a significant additional challenge for most larger sites, I reached out to him with the following question, but haven’t heard back (I’ll update if I do):

You suggested including 404 pages in your xml sitemap if you want to get rid of them quickly. Since most people have automated sitemaps that don’t include 404s, would a temporary sitemap added to the index file with only 404s in it be appropriate? Do you have another suggestion?

My hunch is that this would be a perfectly acceptable way to handle it, and likely much easier for the webmaster to manage.
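If Gary confirms that approach, implementing it would mostly mean adding one entry to the sitemap index file. A minimal sketch of what that could look like (the filenames and domain are hypothetical, not anything either engine prescribed):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sitemap index: the normal, automated sitemap stays untouched. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml</loc>
  </sitemap>
  <!-- Hypothetical temporary sitemap listing ONLY the removed (404) URLs,
       to be deleted once those URLs drop out of Google's index. -->
  <sitemap>
    <loc>https://www.example.com/sitemap-removed.xml</loc>
  </sitemap>
</sitemapindex>
```

The appeal of this approach is that the automated sitemap generation never has to change; only the short-lived "removed" file needs manual (or scripted) upkeep.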

404 Vs. 410?

There was also some contention about how Google and Bing handle 404 (Not Found) vs. 410 (Gone) signals.

Bing was definitive, with Duane saying 410 is considered a definitive signal since it takes additional effort on the webmaster’s part and is unlikely to happen accidentally. He said Bing would check a 404 a few times to make sure that’s what the site owner really meant to do, but that 410 would drop a page from the index immediately.

Gary said Google handles it the same way, but Danny pointed out (and showed him) where Google says this:

Currently, Google treats 410s (Gone) the same as 404s (Not found).

Gary reacted with surprise and asked Danny to email that link to him.
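In practice, choosing between the two signals is a one-line decision on the server. Here is a minimal sketch using Python's standard http.server, with hypothetical product paths, reflecting the behavior Duane described for Bing (not a claim about how Google handles it):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical path sets for illustration only.
TEMPORARILY_MISSING = {"/products/blue-widget"}  # may return: answer 404
PERMANENTLY_GONE = {"/products/old-widget"}      # never coming back: answer 410

def status_for(path):
    """Pick a status code per the panel's advice: Bing re-checks a 404
    (Not Found) a few times before acting, but treats a 410 (Gone) as
    definitive and drops the page from its index immediately."""
    if path in PERMANENTLY_GONE:
        return 410
    if path in TEMPORARILY_MISSING:
        return 404
    return 200

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        code = status_for(self.path)
        self.send_response(code)
        self.end_headers()
        if code == 200:
            self.wfile.write(b"product page")

# To run: HTTPServer(("", 8080), Handler).serve_forever()
```

Given Danny's quote from Google's documentation, the 410 may buy you nothing over a 404 on Google; per Duane, it makes a real difference on Bing.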

301 Vs 302?

Again, Duane was very definitive on how Bing handles 302s. He said Bing will access a 302 five times and then, if it is still there, it will assume you meant to do a 301 and pass the link value over.

Gary chose not to answer this question.

In my own experience, Google has treated 302s the same as 301s ever since HTTP 1.1 redefined the 302 status code to mean “Found” rather than “Moved Temporarily.” Google never updated its documentation to reflect this, but I have seen several instances in which link equity was passed through a 302, and John Mueller of Google has been quoted as saying 302s pass link equity. Again, I reached out to Gary for comment, but haven’t heard back.
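For reference, the two redirect types differ only in the status line the server sends; everything else about the response is the same. A small illustrative sketch (the URLs are made up):

```python
def redirect_headers(location, permanent):
    """Build the minimal response line and headers for an HTTP redirect.
    301 signals a permanent move; 302 ('Found' since HTTP 1.1,
    'Moved Temporarily' in HTTP 1.0) signals a temporary one.
    Per Duane, Bing will treat a 302 that persists across roughly
    five fetches as if it were a 301 and pass link value through it."""
    code, reason = (301, "Moved Permanently") if permanent else (302, "Found")
    return f"HTTP/1.1 {code} {reason}\r\nLocation: {location}\r\n\r\n"

# Example: a product page that has moved for good.
redirect_headers("https://www.example.com/new-page", permanent=True)
```

If both engines eventually flow link equity through a long-lived 302 anyway, the safest practice is still to send the status code you actually mean.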

IP Experience, Bounceback Signals, Patents & Instant Panda Updates

While Duane and Gary both were very clear that what they wanted us to take away from the session was that MOBILE IS HERE and SECURITY IS THE FUTURE (emphasis was theirs), they dropped a few other little nuggets of information:

  • The most interesting to me is that Bing uses signals like what I call bounceback (where someone clicks on a result in Bing and then uses the back button to come right back to the search results) as a quality signal. I always suspected this was the case on both Google and Bing; but to my knowledge, neither of them had ever confirmed it. While Gary didn’t confirm that Google does the same, his body language (a smile and slight nod of the head) indicated that’s likely the case.
  • Gary at one point said something like “it happens pretty much instantly” when talking about Panda, but I think that quote has been taken out of context. What I heard was not that Panda runs instantly, or that fixing your Panda issues will pretty much instantly show results, as others are quoting. What I think he meant was that once your site is clear of Panda (due to the filter refreshing), you can “pretty much instantly” expect your site to show improvement in results. I requested that Gary clarify this, as well, but on this particular point, I don’t expect him to provide a response.
  • Google is working on being able to crawl from other locations to see site customizations targeted toward groups of IP addresses. For example, if you’re located in California and (as Danny said) you want the page to appear more “Californian,” you may choose to change out a cityscape for an ocean view or something. Google is working on being able to detect that type of geo-location personalization. Duane said Bing doesn’t really care about that; it is mostly just looking at the end state of the content. I found it funny that the subtext of that, as I read it, was “go ahead and play with your silly technical toys; we’ll focus on the user.” But that was just my interpretation; Duane was polite, as always.
  • The last thing that bears mentioning (although it actually came rather early in the conversation) was that Gary warned SEOs about monitoring Google’s patents. He said the search engine doesn’t use more than half of what it has patented or applied to patent, and that sometimes even the examples used aren’t factual. This, to me, read more like a “don’t believe everything you read” warning, which most of us don’t anyway. Still, I don’t think it means Bill Slawski (who reads and analyzes Google’s patents for a living) is out of a job!

The Next Big Thing

Danny wrapped up the (far too short!) session by asking Duane and Gary what the next big thing is. They both agreed: wearables and personal digital assistants – beyond the capability of what we see now with Google Now, Siri and Cortana.

Duane recommended the movie Transcendence as a way to see what life might soon be like, which (according to Danny and a show of hands) either no one in the room had seen or no one would admit to having seen. I haven’t seen it, but if Duane loves it, I’ll give it a try. Danny said it was horrible.

Gary recommended the movie Her, saying this is truly what life will be like soon. He reiterated the importance of April 21 (the launch of the mobile algorithm). There was a nod of agreement from everyone, thanks from Danny and the audience, and that was a wrap.

The post Meet The Search Engines: Up Close @SMX West 2015 appeared first on Search Engine Land.