Title tags are a meaningful on-page ranking signal and one of the biggest influences on click-through rate (CTR) from search engine results pages. Higher CTR not only drives more traffic immediately; combined with good on-site engagement metrics, it can also improve organic rankings.
Title tag CTR is also difficult to test: it requires a large number of pages that drive a lot of organic visits, and a pretty big chunk of time, usually around two months: time for Google to index all the pages in the test, and then more time to collect enough data for statistically significant results. A tempting alternative is to use PPC ads to get quick answers about which title tag is best.
Unfortunately, testing at Wayfair.com shows that PPC ads are not always an accurate measure of which titles will perform well in the organic results. The reason is that users who click on paid ads are not a random sample of searchers; they are a self-selected minority, and that alone tells us they behave differently from searchers as a whole.
And it turns out they respond differently to titles than users who click on organic results — specifically, it appears that promotional messaging (“on sale,” “discount,” “free shipping,” “50% off”) performs far better in paid ads than it does in organic results. Titles that performed best in ads often drove organic users away.
The Wayfair SEO team wanted to know if there was a faster way of testing title tag variations that gave accurate answers, so we tested a list of 10 different title tag formulas in three different ways: with Mechanical Turk surveys built on SERP Turkey, with PPC ads and with a live SEO test.
Each test used the same 10 title tag variations (except the Mechanical Turk test, which had only seven) and the same control title tag, and the tests ran separately over a three-month period in the summer.
Here is the methodology and findings of our tests.
We used the Mechanical Turk & SERP Turkey method. We tested our title tag variants with two different keywords, and tested them in the #1 position and the #3 position. Our goal was to get 1,000 clicks in each bucket: not large enough to detect very small changes in CTR, but that was fine, because we were looking only for meaningful wins. This test took about a month.
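To make the bucket-size tradeoff concrete, here is a minimal sketch of the kind of significance check involved. The click counts are hypothetical, and the helper name `two_proportion_z` is mine, not from the study; the point is that at roughly 1,000 sessions per bucket, only fairly large CTR gaps clear the usual |z| > 1.96 threshold.

```python
from math import sqrt

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Pooled two-proportion z-test: is variant CTR different from control CTR?"""
    p_a = clicks_a / n_a
    p_b = clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical buckets of 1,000 sessions each:
big_gap = two_proportion_z(320, 1000, 270, 1000)    # 32% vs 27% CTR: significant
small_gap = two_proportion_z(300, 1000, 280, 1000)  # 30% vs 28% CTR: not significant
```

This is why 1,000 clicks per bucket can surface meaningful wins while missing small CTR differences.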
It’s worth noting that it was difficult to get enough survey completions through Mechanical Turk, and we had to pare down our list of title variations to seven so we could get sufficient clicks in each bucket. A meaningful negative of Mechanical Turk is the fairly limited number of title variations that can be tested at once.
Interestingly, the results of this test differed depending on the position in which the title tag ranked.
SERP Turkey pages — basically mockups of search engine results pages (SERPs) that you build yourself — do not include any paid ads above the results, and my hypothesis is the users who might have clicked on those ads are instead clicking on the organic results (particularly the ones at the top), and we’re seeing that paid click behavior in the organic results.
This heavily biased results in the #1 ranking position and had a smaller effect on rankings farther down the page.
Ultimately, I think SERP Turkey would be a much more valuable platform for this kind of testing if we could also insert fake ads into the fake SERPs, to draw those users out of our test data. That said, this method — using the #3 ranking position and not #1 — correctly predicted the winning title tag; however, it also measured some title tags as performing significantly better than control that actually lost in the live SEO test.
The PPC testing was by far the easiest and fastest way to get results — we had strong results in under a week. We tested our list of 10 variants on a few dozen different keywords, including the same keywords targeted in the SERP Turkey test, with the same control. We ensured only the first line changed between variants and keywords, and over-bid to guarantee visibility.
Unfortunately, the results did not correctly predict the winning title tag — instead, they gave victory to losing title tags. Of note:
It seemed very clear that users who click on paid results are positively influenced by promotional language (“hot deals,” “buy it now,” “save save save”). There did appear to be some other discrepancies with the live SEO test that were harder to explain, other than to say what’s good for a paid ad title isn’t necessarily good for an organic title, and vice versa.
The live SEO test is the tried-and-true method we were trying to replace. The test took about two months to run (30 days for everything to get indexed, then another three weeks to collect visit data).
We took a large number of product listing pages in similar furniture categories, generally ranking on the first page, and randomly distributed 2,000 URLs each into groups for the variants and a control group. We used the same control and variants as the PPC test, and all of the SERPs associated with the pages had ads present (usually three). Almost all also had the Google Shopping block.
As an example, each group might contain 2,000 randomly assigned URLs, with one group held out as the control.
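A minimal sketch of this kind of random assignment, assuming a flat list of URLs and variant labels (the function name, URL paths and seed are all hypothetical, not from the study):

```python
import random

def assign_groups(urls, variants, group_size=2000, seed=42):
    """Shuffle URLs and split them into one control group plus one group
    per title variant, mirroring the 2,000-URL buckets in the live test."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    pool = list(urls)
    rng.shuffle(pool)
    labels = ["control"] + list(variants)
    if len(pool) < group_size * len(labels):
        raise ValueError("not enough URLs for all groups")
    return {label: pool[i * group_size:(i + 1) * group_size]
            for i, label in enumerate(labels)}

# Hypothetical inputs: 25,000 product URLs, 10 title variants plus control.
groups = assign_groups([f"/product/{i}" for i in range(25000)],
                       [f"variant-{v}" for v in range(10)])
```

Random assignment is what lets the visit comparison against the control group be read as the effect of the title change rather than of the pages chosen.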
Each variant was tested on 2,000 pages, and each 2,000-page group received over 100,000 organic visits each week. We measured the change in organic visits to the test pages vs. the control pages to see how the groups performed, and also spot-checked CTR changes in the Search Console, though these numbers are only relevant for terms with large search volume. This method remains the best way we’ve found of testing title (or meta description) changes.
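The measurement itself reduces to a simple lift calculation against the control group. The numbers below are illustrative, not Wayfair’s data, and the helper name is mine:

```python
def visit_lift(variant_visits, control_visits):
    """Percent change in organic visits for a variant group vs. the control
    group over the same window (e.g., weekly totals)."""
    return (variant_visits - control_visits) / control_visits * 100.0

# Hypothetical weekly totals: a promotional-language group under-performing.
lift = visit_lift(92_000, 100_000)  # -8.0 (percent vs. control)
```

Because variant and control groups are measured over the same window, seasonality and algorithm updates affect both and largely cancel out of the lift.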
We saw that groups with promotional language in the title tag all performed worse than the control — sometimes substantially worse. We also saw winners, including some that also did well in the PPC and Mechanical Turk tests.
In our testing, paid ads did not consistently identify winning organic title tags. While trying to improve your title tags is definitely a very smart SEO play, relying on PPC might end up steering you wrong. PPC was able to identify some winners, but also mislabeled losers as winners, particularly when it came to promotional language.
Mechanical Turk fared similarly to PPC ads, particularly in the #1 ranking position. I hypothesize that an effective test requires mock SERPs with ads at the top. The number of variants you can test is fairly limited because Mechanical Turk’s worker pool is relatively small, and it still took a pretty long time to generate results. Mechanical Turk is probably only a viable testing platform for sites that don’t have enough search traffic to their pages to do an on-site test.
Neither method mislabeled winners as losers: if a title did badly in Mechanical Turk or PPC, it also did badly in organic, so using PPC ads or Mechanical Turk can be a good way to at least weed out very bad title variants. But ultimately, neither was a replacement for actual live SEO testing on high-traffic pages.
One piece of general title advice I can give after several rounds of testing over the years is this: It is incredibly difficult to find an e-commerce title tag better than a simple “[category name] | [site]” formula.
So “Mustache Wax | Rand’s ‘Stache” is almost always a winner.
The post Study: PPC cannot accurately identify winning organic titles appeared first on Search Engine Land.