Back in December 2012, we published the first article of our SEO for Web Designers series. The fundamentals we covered still form a great basis for the optimization of a website, but because of some changes to Google’s algorithm, I’ve decided to give you some additional advice about the application of these techniques.
Google tweak their search algorithm on a regular basis. In 2013, there were no fewer than fifteen noteworthy changes, in addition to several minor updates.
You’ve probably heard of terms like ‘Google Panda’ and ‘Google Penguin’ before. These are the names given to major search algorithm changes. Panda, first released in 2011, is pretty much a content quality filter that lowers the rank of low quality sites. A year later, we saw the Penguin update, which targeted suspicious links, such as paid links or the use of link networks. Thousands of domains, including popular sites such as Rap Genius and Interflora, were hit by these tweaks. According to Google, Panda impacted 12% of all search results, while Penguin affected a further 3.1%.
Those percentages are dwarfed by the Hummingbird update last year. According to Matt Cutts, it impacts more than 90% of searches worldwide, making it one of the biggest changes to Google’s algorithm in recent years.
The goal of this update is to improve the semantic search capabilities of the search engine; improved understanding of intent and context should produce more relevant results.
Perhaps the best illustration of the possibilities of this update is conversational search. Because of a better understanding of the meaning of words, Google can now answer search queries that have been difficult to solve in the past.
SEO is ever-changing. Several approaches that were very effective in the past have become obsolete (which is why search engine optimization is such an interesting field!).
Updates from previous years influence the way we do SEO in 2014. That’s why we need to take the following changes into account:
In 2013, Google decided to enable secure searching for all users. Owing to this SSL encryption, keyword data is no longer shared.
You can see this for yourself. If you’re using Google Analytics, take a look at Acquisition > Keywords > Organic. The chances are that the top spot is taken by (not provided), indicating that the visitor used a secure search.
This means that organic traffic from Google can no longer be tracked at a keyword level. Unfortunately, we lose a lot of valuable information about our visitors because of this ‘minor’ change. In 2014, it will be more difficult to optimize your site for relevant keywords.
That said, Google wants us to stop focusing on keywords and start thinking about producing quality content. Which brings us to the next change.
The loss of keyword data and the introduction of the Hummingbird update force us to shift from keywords to topics.
As I’ve said before, Hummingbird puts a greater emphasis on context and user intent. Google can now show web pages that answer queries even if the page is not optimized for that query.
This has an impact on the way we’ll do SEO in the future. Instead of optimizing a page for a certain keyword, we’ll need to think outside the box and create content around topics, defining coherent keyword groups that can all be served by the same content.
In August 2013, Google introduced a new type of search result: in-depth articles. These results appear in a separate block on the right side of the SERPs and provide high-quality content to help you learn about a certain subject.
There are several steps we can take to improve the chance of showing up in the in-depth article list. Structured data plays a big role (more about that later).
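One of those steps is marking up your articles with schema.org vocabulary. As a minimal sketch (the headline, date and other values below are placeholders), an article marked up with the schema.org Article type might look like this:

```html
<article itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">A Detailed Guide to the Topic</h1>
  <img itemprop="image" src="cover.jpg" alt="Cover illustration">
  <meta itemprop="datePublished" content="2014-01-15">
  <p itemprop="description">A short summary of what the article covers.</p>
  <div itemprop="articleBody">
    <!-- the full article content goes here -->
  </div>
</article>
```

The itemscope/itemtype attributes tell search engines what kind of thing the element describes, and each itemprop labels one of its properties.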
This update once again shows the importance of quality content. Thin content with little or no extra value has no chance of making it in the in-depth articles block. Combine this with the goal of the Hummingbird update (topics instead of keywords) and it’s clear that Google wants us to create compelling content with a focus on the user’s intent.
For this reason, in the future, we’ll see a lot more ‘big content’, pages that discuss a certain topic in great detail.
One of the best examples of this technique is ‘Brandopolis’, an article from Distilled about content strategy (worth the read if you have some spare time).
Quicksprout use a similar approach for their copywriting guide.
These extensive articles have two big advantages:
Links are still the holy grail of SEO. Search engines see each link to a website as a vote for that website. The more links a page receives, the easier it becomes to rank well.
Quantity on its own isn’t everything; quality also plays a big role. Link spam (often generated by automated software) can quickly create hundreds or thousands of links by dropping URLs with optimized anchor text on forums and in comment sections. These links are easy to build, but offer little or no extra value to the visitor.
Plenty of websites with suspicious link profiles have already received an unnatural link warning from Google. This penalty limits their visibility in the SERPs and forces them to clean up their link profile.
Good structure helps search engines understand the contents of a page. Unfortunately, HTML markup is too limited to define the true meaning of various elements, though this has improved somewhat with the introduction of HTML5. There are plenty of new tags (for example, <section>) that we can use to organize a page and help search engines understand the content. At the moment it isn’t completely clear how Google handles these tags, but it’s undoubtedly a good idea to start implementing them in projects and consider the semantics of your markup.
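To illustrate, here is a simple page outline using HTML5 sectioning elements (the headings are placeholders); each section groups a coherent subtopic under its own heading:

```html
<article>
  <header>
    <h1>Article title</h1>
  </header>
  <section>
    <h2>First subtopic</h2>
    <p><!-- content for the first subtopic --></p>
  </section>
  <section>
    <h2>Second subtopic</h2>
    <p><!-- content for the second subtopic --></p>
  </section>
  <footer>
    <p>Author and publication details</p>
  </footer>
</article>
```

Even if it’s unclear how much weight Google gives these elements, the structure makes the document outline explicit for any parser.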
Structured data can also be a solution. It helps search engines understand the content of a page, and it can also improve visibility in the SERPs via rich snippets (and the new in-depth articles box). Examples of rich snippets are video thumbnails, prices and star ratings. These elements can give the CTR of your page a nice boost.
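As a sketch of how a star-rating snippet is produced (the product name, rating and price are made-up values), you can annotate the visible content with schema.org microdata:

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5
    based on <span itemprop="reviewCount">28</span> reviews
  </div>
  <span itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <meta itemprop="priceCurrency" content="USD">
    $<span itemprop="price">19.99</span>
  </span>
</div>
```

With markup like this in place, Google may display the rating stars and price directly in the search result.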
Another type of rich snippet is authorship markup. Authorship information allows you to link an article to the Google+ profile of the author. Thanks to this link, a small profile image is displayed next to the search result.
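In practice, this link is made with a rel="author" attribute pointing at the author’s Google+ profile (the profile ID below is a placeholder):

```html
<!-- In the <head> of the page: -->
<link rel="author" href="https://plus.google.com/your-profile-id">

<!-- Or as a visible byline link, using the ?rel=author parameter: -->
<a href="https://plus.google.com/your-profile-id?rel=author">By Jane Doe</a>
```

For the markup to work, the Google+ profile should also link back to the site from its “Contributor to” section, so the relationship is confirmed in both directions.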
Google gives each author an AuthorRank. Popular authors could see their content rank higher than unsigned content or the content of less authoritative authors. Once again, the quality of the content you produce is paramount.
I probably don’t have to tell you that, driven by the surging sales of smartphones and tablets, more and more visitors are using a mobile device to browse the web. This is why a mobile-optimized site is important.
Google recommends a responsive design for several reasons, one of them being loading time: a couple of years ago, Google incorporated site speed into its algorithm, which means that a slow-loading page can harm your ranking. So don’t forget to optimize your site for page speed (Yahoo’s YSlow might come in handy).
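The basic ingredients of a responsive page are the viewport meta tag and CSS media queries. A minimal sketch (the class name and breakpoint are arbitrary):

```html
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  .sidebar {
    float: right;
    width: 30%;
  }
  /* Stack the sidebar below the main content on narrow screens */
  @media (max-width: 600px) {
    .sidebar {
      float: none;
      width: 100%;
    }
  }
</style>
```

Because the same URL serves every device, Google only has to crawl the page once and users never hit a redirect, which also helps with the loading time mentioned above.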
Although social media can bring in tons of visitors, it still doesn’t have a big influence on your rankings (there’s a correlation between social signals and rankings, but no proven causation). This might change in a couple of years, so it’s still a good idea to start building a social presence. Don’t forget to add Open Graph markup to your pages to optimize your social snippets.
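Open Graph markup is a set of meta tags in the head of the page that tell social networks which title, image and description to show when the page is shared. A minimal example (the URLs and text are placeholders):

```html
<head prefix="og: http://ogp.me/ns#">
  <meta property="og:title" content="SEO in 2014">
  <meta property="og:type" content="article">
  <meta property="og:url" content="http://example.com/seo-in-2014/">
  <meta property="og:image" content="http://example.com/cover.jpg">
  <meta property="og:description"
        content="How recent algorithm updates change the way we do SEO.">
</head>
```

Without these tags, social networks guess at a title and thumbnail; with them, you control exactly how the shared snippet looks.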
There aren’t any drastic changes in the way we’ll do SEO in 2014. We still need to provide relevant content and a good user experience for our visitors. Nevertheless, there are some additional tips that might help your SEO efforts: