Millions of us have asked Google to play weatherperson. In order to get an answer to one of our oldest and most basic questions, we type a variety of words and numbers — city names, neighborhoods and ZIP codes, plus words like “weather,” “forecast” or maybe “temperature” — into the Google search box, and Google typically has an answer back to us in less than a second.
But have you noticed that our weather-related searches can now be more complex than ever?
It’s true. In recent years, Google has gone from focusing on the keywords we use to figuring out the meaning of our search queries. That’s essentially what Google’s Hummingbird algorithm is about, although, as I’ll show below, this started happening in weather searches before Google announced Hummingbird.
As a Google spokesperson explains it, we can now use a variety of different queries and questions to get weather answers, and this is “broadening continually as [Google's] systems get smarter and more comprehensive.”
With that in mind, let’s take a look at the evolution of weather-related searches and how Google handles them.
It all started as a Google engineer’s “20% project” — something Google employees are free to pursue as part of their regular work routine.
Ben Sigelman announced the weather search feature in March 2005 by inviting users to search with simple terms like weather Chicago or “whatever your U.S. location is.” He said ZIP codes would work, too. A website called Synthstuff.com posted this screenshot a day after Google’s announcement:
A few months later, this same weather search capability launched for mobile phones.
In September 2009, Google’s weather “OneBox” looked almost the same as it did at launch more than four years earlier. Here’s a screenshot from a Search Engine Land article about OneBoxes and direct answers dated September 28, 2009:
But about 10 days later, Google did make a change to its weather search results — it added links to external weather providers like The Weather Channel, Weather Underground and AccuWeather. Here’s a screenshot from Google’s post:
And then a couple of months after that, Google started showing weather in Google Suggest (a feature that no longer exists today).
Google rolled out other weather-related search changes after that — things like interactive weather results on mobile, the addition of weather inside Google Maps and more. But Google started getting serious about weather searches in 2012.
This is when things really started to change. Several announcements that weren’t specifically about weather search combined to expand how Google recognized weather-related searches, and how it returned results for those searches.
Google began supporting natural-language searches in the summer of 2012 in an enhanced version of Google Voice Search that was first available to smartphone users (the iOS version was delayed several months). You could ask Google things like what’s the weather like in san francisco or what will the weather be like this weekend and Google would answer.
Prior to that, Google had already announced the Knowledge Graph and began using card-style displays on Google Now. So in July 2012, weather searches started to have a decidedly new look on smartphones and tablets — with a lot more weather information on display:
On the visual/result side, that pretty much brings us right up to today. Do a weather search on Google desktop, your smartphone or your tablet, and you’re going to see something very similar to that.
But on the query side, Google continues to expand its ability to recognize weather-related searches. Let’s take a look at that.
For years, it took the right keywords in your search query to trigger weather results — obvious ones like weather in seattle or 98101 weather. Not so anymore. Consider these examples from searches done on Thursday night:
Those are all desktop searches on Google.com. Another cool thing Google is doing is applying its conversational search feature to weather, which lets you ask a series of questions as if you were talking to another person.
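To see why that old behavior was so brittle, here’s a minimal sketch of keyword-based triggering — a toy illustration only, not Google’s actual implementation. The trigger list and location check are my own assumptions; the point is just that an answer fires only when an explicit keyword appears next to something location-like:

```python
import re

# Toy illustration (NOT Google's actual logic): the old keyword-trigger
# approach fires a weather answer only when the query contains an
# explicit trigger word plus a location-like remainder (word or ZIP).
TRIGGERS = ("weather", "forecast", "temperature")

def triggers_weather_onebox(query: str) -> bool:
    """Return True if the query contains a weather trigger word
    alongside a location-like token (a word or a 5-digit ZIP code)."""
    q = query.lower()
    if not any(t in q for t in TRIGGERS):
        return False
    # Strip the trigger words; whatever remains is treated as the location.
    remainder = q
    for t in TRIGGERS:
        remainder = remainder.replace(t, " ")
    return bool(re.search(r"[a-z]{2,}|\b\d{5}\b", remainder))

print(triggers_weather_onebox("weather in seattle"))     # True
print(triggers_weather_onebox("98101 weather"))          # True
print(triggers_weather_onebox("what's it like outside"))  # False -- no trigger word
```

A natural-language query with no trigger keyword falls straight through, which is exactly the gap Google has been closing.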
Here’s a series of searches/questions that I asked Google’s iOS app on Thursday night. (You can click to see the larger version.)
As you can see from left to right, on each successive question that I spoke, Google recognized that I had started by asking about the weather in Seattle. It carried that through to the second and third search even though I didn’t specifically mention “weather” or “Seattle” again.
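The mechanism behind that carry-over can be pictured as keeping a small conversational state between queries. Here’s a deliberately naive sketch — the class name, the "in <place>" extraction, and the responses are all my own assumptions, not how Google actually does it — showing follow-up questions inheriting the location set by the first question:

```python
# Minimal sketch (an assumption, not Google's real system) of
# conversational context carry-over: follow-up questions inherit the
# location from the previous query when they don't mention one.
class WeatherConversation:
    def __init__(self):
        self.location = None  # remembered across turns

    def ask(self, query: str) -> str:
        q = query.lower()
        # Very naive location extraction: "in <place>" updates the context.
        if " in " in q:
            self.location = q.split(" in ", 1)[1].rstrip("?").strip()
        if self.location is None:
            return "Which location do you mean?"
        # Later turns reuse the remembered location.
        return f"Weather answer for {self.location}"

convo = WeatherConversation()
print(convo.ask("what's the weather like in seattle"))  # Weather answer for seattle
print(convo.ask("what about this weekend"))             # Weather answer for seattle
print(convo.ask("and the day after"))                   # Weather answer for seattle
```

The second and third questions never repeat “weather” or “Seattle,” yet they still resolve against the remembered context — the same behavior the screenshots above demonstrate, minus all the hard parts.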
Impressive, for sure, but there are still queries that I think are obvious for this weather answer/OneBox — queries that Google isn’t recognizing that way. An obvious one is what’s it like outside, and there are also queries such as is it cold and should i wear shorts tomorrow that don’t produce a weather answer (although should i wear a jacket tomorrow does work).
I’ve obviously focused on English-language searches in the U.S., but Google tells us that the weather answer/OneBox that we see now is also in use in “a wide set of languages” around the world. As for the improvements in recognizing natural-language search, Google says some of these same kinds of queries are available in French, Italian, German, Spanish, Portuguese, Japanese and Korean.
And yes, both Bing and Yahoo also provide weather answers, but neither comes close to what Google is doing with natural-language search. When I try some of the searches above on the desktop, like will i need a jacket tomorrow and is it cold out right now, neither one shows me weather details.
All in all, it’s impressive to watch how Google has gone from the most basic of commands (like seattle weather) to now being able to show a series of weather-related responses in conversation-style searches like the mobile example above.
I’m no weather person, but I’d say it’s safe to forecast that this kind of natural-language query analysis is only going to grow in the future.