A wise person once said that if you think search engine optimization (SEO) is expensive, you should see how expensive cheap SEO gets.
There are few shortcuts when it comes to sustainable search engine optimization. As with any other professional service, quality SEO comes at a price — and premium services are more likely to deliver the desired results. But regardless of your online business's size and marketing budget, there are many ways to save SEO resources.
Few site owners can tell how many pages of their site Google has to crawl in order to index and eventually rank a single page. Let’s assume the ratio is close to ten to one.
For many large sites, the actual ratio is closer to one hundred to one or higher. That means search engine bots have to find and crawl more than a hundred pages of a site before they can rank one — if it ends up ranking at all.
If the crawl-to-useful-content ratio is unfavorable, chances are that Googlebot is spending much of its crawl budget on pages that were never designed to perform well in search.
Using the meta noindex tag on pages with little or no content (pages that risk making a site a Panda algorithm candidate, or that simply do not perform as expected) is a proven technique for steering major search engine bots toward the pages that matter, and for getting those pages crawled more frequently.
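In practice, the technique above is a single line in a page's head section. A minimal illustration (the directive values shown are the common choice for this scenario):

```html
<!-- On a thin or low-value page: ask search engines not to index it,
     while still allowing them to follow its links -->
<meta name="robots" content="noindex, follow">
```

Unlike blocking the page in robots.txt, this still lets crawlers read the page and pass along its internal links; it only keeps the page itself out of the index.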
There is no reason to apply the rel="nofollow" attribute to a website's internal links or to links pointing to associated social media channels such as Facebook, Twitter or Google+.
The rel="nofollow" attribute — for a short time believed to be the "magic bullet" for PageRank sculpting — indicates just one thing to search engines: there is no reason to trust the page the link is pointing to.
Despite countless speculations in that regard, there is no hard evidence that selectively applying rel="nofollow" to links on your page causes PageRank to be preserved and distributed among the other links on the same page.
Unless there are user-generated outgoing links pointing to unverified sites, it's best to abandon using the rel="nofollow" attribute once and for all.
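To make the distinction concrete, here is a small sketch (the URLs and anchor texts are placeholders):

```html
<!-- Internal link, or a link to your own social profile: no nofollow needed -->
<a href="/products/">Our products</a>

<!-- User-submitted link to an unverified site: nofollow is appropriate -->
<a href="http://example.com/" rel="nofollow">visitor-submitted link</a>
```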
Minimizing page size — and thereby improving load time for search engine crawlers and users alike — is a top priority. Remember, site speed has been a ranking factor for quite some time now.
While there are several ways to reduce page size, a good place to start is by getting rid of page elements that are remnants of a past long gone. Meta keywords, for example, are obsolete. Major search engines have been ignoring them for years.
Using meta keywords serves no purpose other than giving competitors insight into which commercial terms a page is meant to rank for. It is high time to remove them.
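For reference, this is the kind of obsolete tag worth deleting (the example terms are made up):

```html
<!-- Obsolete: major search engines ignore this tag, and it reveals
     your target terms to competitors. Safe to remove. -->
<meta name="keywords" content="cheap widgets, buy widgets online">
```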
When it comes to evaluating backlinks — for example, while assessing risks associated with link building — complete data is essential. The sample provided in Google Search Console (aka Google Webmaster Tools) is limited, but it is updated regularly.
There’s no guarantee how often that happens, but a few times per week appears to be the average frequency for most sites.
That link data is not only free of charge but also temporary, unless you constantly download and preserve it. It may seem laborious, but documenting your links over time will prove extremely useful for ongoing backlink profile maintenance, especially if you wind up dealing with a manual spam action from Google.
Having that backlink data available when it's urgently needed will make it much easier to initiate recovery efforts in the event of a Google penalty. If you don't have this information on hand, you'll have to spend time crawling your entire backlink portfolio — which, depending on volume, may take several weeks to produce a sufficiently complete sample.
Not downloading the backlink sample provided in Google Search Console is a missed opportunity. There are no good reasons not to download and save the backlink sample on a daily basis.
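Since the backlink sample is only available as a manual CSV export from Search Console, the part worth scripting is the archiving itself. Below is a minimal Python sketch, with placeholder file and folder names, that files each day's export away under a date stamp so older samples are never overwritten:

```python
import shutil
from datetime import date
from pathlib import Path

# Paths are placeholders -- point them at wherever you save the export.
DOWNLOAD = Path("latest_links.csv")   # the CSV exported from Search Console
ARCHIVE = Path("backlink_archive")    # long-term storage folder

def archive_backlink_sample(download: Path = DOWNLOAD,
                            archive: Path = ARCHIVE) -> Path:
    """Copy today's export into a date-stamped archive file."""
    archive.mkdir(exist_ok=True)
    target = archive / f"links-{date.today():%Y-%m-%d}.csv"
    shutil.copy2(download, target)  # preserves the file's timestamps
    return target
```

Run daily (e.g., via cron right after the download), this builds the longitudinal link record that makes a later audit or reconsideration request far cheaper.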
Thorough documentation of all ongoing website updates, technical or otherwise, makes any necessary rollbacks far less painful. Documentation is required with any link-building activity, whether conducted in-house or by trusted business associates. In fact, any ongoing link-building activities require two important steps:
Both steps can be part of an amendment to an existing contract and will help foster mutual trust in the quality of ongoing link-building initiatives. They can also significantly lower the cost of a backlink audit and speed up the reconsideration request process if it should become necessary.
The value of server logs is frequently underestimated, and sometimes ignored altogether, despite their tremendous potential for audit evaluation (and as a bargaining chip driving up the sale price of a successful website).
While collecting logs carries some nominal cost, the potential benefits of using log data to understand Googlebot and user behavior alike are unparalleled. A number of effective tools aid log analysis, including (but not limited to) Botify Log Analyzer, Logentries, Logsearch, Logz and Splunk; all of these can be greatly supplemented by Google BigQuery.
Lastly, one great way to avoid unnecessary SEO costs is to make well-informed decisions and steer clear of bogus or dishonest (often automated) search engine optimization services, which at best have no impact at all and at worst can jeopardize a site's reputation with search engines.
Merchant platforms are full of suspicious SEO service offers, not all of which are easily recognized for what they are. That is why continuously following established information sources (such as the Google Webmaster Guidelines) and leading SEO industry authorities (like Google's John Mueller) is time well spent.
What are your favorite cost-saving methods when it comes to SEO? Your opinion matters. Please share your thoughts and suggestions in the comments down below.