2014 SEO Playbook: Off-Page Factors

Are you ready for 2014?

Today’s column marks the third and final entry in my annual SEO Playbook. Part 1 primarily focused on what Hummingbird will mean for marketers in 2014, especially as it relates to content and authority. Part 2 took an updated look at on-page SEO factors, including content, HTML and architecture.

In Part 3, I’ll discuss off-page factors that SEOs will need to consider as we enter the new year. Enjoy!

Links: Quality

There is a lot to be said about links. Google is going to continue its trend of getting more discerning and aggressive with penalties in 2014. Quite frankly, its link analysis keeps getting better, and the search and spam teams’ confidence that they will not unjustly target innocent websites keeps growing stronger.

One mantra from 2013 is that “link building is dead.” I wouldn’t go that far. There is a lot you can do to encourage links without resorting to artificial means or outright begging. I see link building programs being folded into influencer marketing programs and becoming more networking-oriented. In my opinion, there is nothing wrong with sending out an e-mail notifying your network about your new content, as long as the decision of whether or not to link is ultimately up to them.

Diversity — i.e., links from a variety of sources — is also important. If all of your links are coming from your network or the same websites over and over, you could be in trouble. You cannot put your content on autopilot, simply ticking boxes in your editorial calendar. You really need to be actively promoting your content, enough to grow a real audience. When you do this, link diversity tends to take care of itself.

For websites that already have a lot of low-quality links out there (and I would absolutely suggest an audit to find this out, especially if you’ve ever used a link building service), you have a difficult choice ahead of you if your site has not already been hit with a Penguin penalty. Do you engage in a link-cleaning program because you fear a future algorithm update may strike your site, or do you do nothing and wait it out?

This can be a difficult judgment call, one where professional counsel is in order. I would suggest attempting to remove the most egregious links and engaging in a wider campaign if more than 40% of your links are of low quality. For full disclosure, 40% is not a scientific number; it is a guesstimate. Let’s just say that if I looked at the website and saw that 40% or more of its offsite links were low quality, my skin would start to crawl.

Even if you are not under a penalty, copiously log your link cleanup efforts. Should you be hit with a manual penalty in the future, this log can help to demonstrate that you’ve already made efforts toward rehabilitation and may speed up the reconsideration process.

Another concern is when too many offsite links use the same anchor text. This can occur quite naturally when other sites link to your pages using the article title or title tag. That is generally fine. The real concern comes from unnatural repetition of individual keywords or key phrases. To date, Google shows little interest in grandfathering old links, so be certain to include anchor text as part of your link spam analysis.
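As a rough illustration, an anchor-text audit can start with a simple tally of the anchors in an exported backlink report, flagging any single phrase that dominates. A minimal sketch in Python — the function name, sample anchors and 30% threshold are my own illustrative assumptions, not an official Google guideline:

```python
from collections import Counter

def flag_overused_anchors(anchors, threshold=0.30):
    """Return anchor texts that account for more than `threshold` of all
    inbound links -- a possible over-optimization signal worth reviewing."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {text: n / total for text, n in counts.items()
            if n / total > threshold}

# Example: a backlink export where one keyword phrase dominates
anchors = ["cheap blue widgets"] * 7 + ["Acme Inc"] * 2 + ["homepage"]
print(flag_overused_anchors(anchors))  # {'cheap blue widgets': 0.7}
```

Branded anchors and article titles will naturally repeat, so a flagged phrase is a prompt for manual review, not proof of spam.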

As for Google’s link disavowal tool, I would not bother with it unless you are certain a penalty is in place. If you are doing preventative off-site link rehabilitation, another reason to keep a log is so that you can populate the disavow tool quickly if you need to.

Links: Quantity

When it comes to links or domain authority or page authority, the old adage has always been “quantity and quality.” This will not change. If you are not earning new and better links at a faster rate than your keyword competitors, you’ll probably lose many ranking battles.

Links: Paid

There is not much I can say here other than do not purchase links in hopes of better rankings. If you have gone to a search engine optimization conference over the last year, you probably noticed that link sellers are disappearing from the exhibit floor. There is a reason for this, and it is because Google has their number. Just don’t do it.

Trust: Authority

Trust has really started to evolve as a search engine ranking factor, or set of factors. Old signals like domain age are less important, partially because they were never that meaningful to begin with and partially because search engines are able to put more faith in new and better algorithmic signals.

Now, in addition to links from high trust sites like whitehouse.gov or adobe.com, trust is more about things like brand recognition and author recognition. You can be certain that Google and Bing have a database of brands and an automated way to add new ones to the list. Brands are important and get a boost in the rankings — not because you know them, but because people write about them and link to them.

One of the best ways to build trust is to employ author and publisher tags on your content while encouraging your writers to be professionally active in social media. I am also a big fan of inviting or hiring trusted influencers to contribute and write for your company blog. I realize there is an ongoing debate about paid content, so let me be clear: I am not advocating purchasing paid content for the sake of getting stuff onto your website. I am saying seek out recognized experts to write amazing stuff for you and pay them what they are worth.
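In 2014 terms, the author and publisher tags mentioned above are the `rel="author"` and `rel="publisher"` link elements pointing at Google+ profiles. A sketch — the profile URLs are placeholders:

```html
<!-- On an article page: tie the byline to the writer's Google+ profile -->
<link rel="author" href="https://plus.google.com/1234567890" />

<!-- On the homepage: tie the site to the brand's Google+ page -->
<link rel="publisher" href="https://plus.google.com/+ExampleCompany" />
```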

Another way to build author trust is to have a central author as a voice for your company blog. One person devoting their time to developing great content and promoting it in social media will go a lot further than having round robin contributions from everybody on your staff. Author trust is something to be developed, and as it grows, so does the trust given to all of their past articles.

Trust: Piracy

Is your content management system up-to-date? Most CMS updates include security patches to prevent takeovers and piracy. Do not fall behind.

If your server or website does get hacked or infected with malware, take it off-line immediately and put up a 503 page. This lets the search engines know your site is temporarily off-line and will return shortly. If the search engines have blocked your site to protect their users, do not go back online until you have solved the problem; then file a reconsideration request.

One of the golden rules is not to ask your paid-advertising Google or Bing representative for help with non-paid search. This is probably my one exception to that rule. Whether it will work or not is debatable, but after you fix your website, anything you can do to speed up the reinclusion process is worth doing. Besides, if you were using paid search and your site gets taken down for malware, you want your ads to begin working again once you fix the website.
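For reference, a 503 response should also carry a Retry-After header so crawlers know the outage is temporary and when to come back. A minimal WSGI sketch — the maintenance copy and the 24-hour retry window are placeholder assumptions:

```python
def maintenance_app(environ, start_response):
    """Answer every request with 503 Service Unavailable so search engines
    treat the outage as temporary rather than dropping pages from the index."""
    start_response("503 Service Unavailable", [
        ("Content-Type", "text/html; charset=utf-8"),
        ("Retry-After", "86400"),  # suggest re-crawling in about 24 hours
    ])
    return [b"<h1>Down for maintenance</h1><p>We'll be back shortly.</p>"]
```

While the real site is being repaired, this could be served with the standard library, e.g. `wsgiref.simple_server.make_server("", 80, maintenance_app).serve_forever()`.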

Social: Connections & Interactions

The reality of social media as a search engine ranking factor has not met the hype created by the search engines and optimization professionals. To be sure, social media is a ranking factor and one that will continue to become more important. That said, social will not replace link authority any time soon, and it appears to be progressing slower than anticipated.

Social media metrics such as Facebook likes and shares or Twitter mentions and retweets have a high correlation with high rankings. But, as the search engine representatives like to remind us, correlation does not equal causation. Right now, this really is a case where popular websites and influencers are as likely to get links as they are social votes.

It is important to understand the relationships that search engines have with social media sites and to be active on those sites. For example, Google owns Google+, and Bing has relationships with Twitter and Facebook. And of course, personalized results will continue to be influenced by social media connections. If a user has a connection to a person or brand, search engines will use that connection to display relevant content.

Personalization: Country & Locality

International and local search results have been areas of focus for several years now. There are plenty of things to optimize for international and local results, such as proper use of subdomains or country-code top-level domains, tagging pages with language codes, registering geographic targets in Google Webmaster Tools, and registering businesses in Google+ and Bing Places for Business. Do not ignore these.
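Tagging pages with language codes typically means hreflang annotations in the document head, pointing each language/country variant at its siblings. A sketch with placeholder URLs:

```html
<!-- Point search engines at the right language/country variant -->
<link rel="alternate" hreflang="en-us" href="http://example.com/us/" />
<link rel="alternate" hreflang="en-gb" href="http://example.com/uk/" />
<link rel="alternate" hreflang="pt-pt" href="http://example.com/pt/" />
<link rel="alternate" hreflang="x-default" href="http://example.com/" />
```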

At the same time, it once more comes back to links. If you are getting links from sites related to the geographic locations you’re targeting, your website is more likely to break into local rankings for those places. Factors like IP address and server location will continue to become less influential as search engines get better at measuring user-centric signals.

I think this is one area in which social media will eventually play a major role. For example, if many people in Portugal have a company in their Google+ circles, that company may be more likely to appear in search results inside Portugal. That type of signal is a lot more meaningful than whether the server resides within Portugal or whether the website is written in Portuguese (a language used in multiple countries).

Personalization: History

Like social media, personal history is a ranking factor that is slowly coming into its own. Right now, if you are logged into Google or use Chrome and visit a web document, that page or site is more likely to show up in future search results. If social media friends visit a page, that document or site is more likely to show up in your future results. Based on personal experience, this is pretty fluid and seems to be one of those things the search engines keep evolving.

Going forward, it makes a lot of sense for search engines to give trust to webpages that lots of people visit, something they can evaluate by using the collected search history data stored in their databases.


Overall, all of the ranking factors boil down to quality, authority and trust. As search engines find new ways to collect data and become better at evaluating the data they already have, it makes sense that the algorithm will shift from easy-to-measure but less useful signals (like domain age or server location) toward harder-to-measure but more telling signals (like visitor location and author trust).

For the last two years, we have been seeing that, thanks to Panda and Penguin, the search engines finally have teeth to put behind their policies and guidelines. Search engine optimization is no longer about technical tricks designed to outwit Google and Bing. It is about building an audience, earning trust, and publishing genuinely useful information that people want to consume.

Some call this a new age of search engine optimization. Others say it is the end of SEO and the Golden Age of inbound marketing. One thing is for certain, though: with our current technology, we have more data than ever before to tell us what is working and what is not. Ultimately, the winners are those websites and businesses that can accept the new realities and do something with them.

Obviously, I am a big content proponent because it is the basis for everything from keyword rankings to earning links and attracting influencers. At the same time, one size does not fit all. You must understand and execute what is best for your business given your goals and objectives. Just keep in mind that search engine rankings get earned because of great online marketing programs — and SEO does not create great online marketing.