
How to use heatmaps to strengthen your A/B testing results

Few things are as frustrating as planning an A/B test, creating a beautiful variation of your website and driving traffic to it, only to find that there’s almost no difference in conversions between the two variations. Worse still, sometimes your variation converts even fewer visitors than the original design.

The unfortunate truth of A/B testing is that while some tests increase your website’s conversion rate, improve your return on investment and get more from your marketing campaigns, others can leave you feeling nothing but confusion and a sense of wasted time.

But don’t be disheartened. Read on to discover how to transform your ‘failed’ A/B test into an actionable learning experience using heatmap analysis.

Learn to approach a ‘failed test’ as a learning opportunity

Making sense of an unusual split test result can often be difficult. When a variation doesn’t perform as you expected, it’s not always easy to diagnose the source of the problem and how to fix it.

Split testing software can show you which variation performed the best, but it can’t show you why it outperformed the other. A visual representation of the data can often show you more than a chart or table of statistical data. It enables you to see, from your visitors’ perspective, how your website performs.

When your A/B test returns results that are so similar they’re almost impossible to analyse, or they go against what you expected to happen, it’s time to dig into other data sources to learn more.
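Before digging into heatmaps, it’s worth confirming that a "similar" result really is inconclusive rather than a small but real difference. As a rough sketch (not a feature of any particular split testing tool, and with invented visitor numbers), a two-proportion z-test on the raw conversion counts shows whether the gap between control and variation is bigger than random noise:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing two conversion rates.

    conv_a/conv_b: conversions, n_a/n_b: visitors per variation.
    Returns the z statistic and an approximate two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under "no difference"
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # p-value from the standard normal CDF via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: 120/2400 conversions vs 138/2400
z, p = two_proportion_z(120, 2400, 138, 2400)
```

With these made-up figures the p-value comes out well above 0.05, i.e. the test is genuinely inconclusive — exactly the situation where heatmap data earns its keep.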

Gain another level of analysis that isn’t purely statistical

By adding heatmaps to both your control and your variation, you can discover how visitors are actually using your site.

Heatmaps give you instant access to insight on why your variation didn’t perform as expected. From page elements that attract attention but not interest, to poor design, heatmaps reveal how visitors are using your web pages and what opportunities are available to turn your split test loss into the win you were hoping for.

What clickmaps can tell you about your split test

Does click activity differ between variations?

Compare clickmap data between the control and variation to discover how elements perform differently on each page. Are the clusters of clicks on the call to action button on one variation, but dispersed all over the page on the other? Are unimportant elements being clicked on, taking visitors away from the conversion path?
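To make that comparison concrete, one simple summary is the share of all recorded clicks that land on the call to action. A minimal sketch, using invented click counts and element names rather than any real clickmap export format:

```python
# Hypothetical click counts per page element for each variation (invented numbers)
control = {"cta_button": 310, "nav_menu": 140, "footer_links": 50}
variation = {"cta_button": 180, "nav_menu": 260, "footer_links": 160}

def cta_click_share(clicks):
    """Fraction of all recorded clicks that hit the call-to-action button."""
    return clicks["cta_button"] / sum(clicks.values())

control_share = cta_click_share(control)      # 310 / 500 clicks
variation_share = cta_click_share(variation)  # 180 / 600 clicks
```

Here the control concentrates clicks on the call to action while the variation scatters them across navigation and footer links — the kind of "attention leak" a clickmap makes visible at a glance.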

What scrollmaps can tell you about your split test

How far are users scrolling down each page?

Compare scrollmap data to discover which page is most effective at keeping users engaged and interested. This is especially important on long, content-heavy pages where important elements may be below the fold. Scrollmaps provide insight into what percentage of your visitors are scrolling far enough down the page to see your call to action. If a high proportion of your visitors are not scrolling down far enough to see it, they certainly won’t be clicking on it.
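That percentage is straightforward to compute if you log each visitor’s maximum scroll depth. A toy illustration with invented numbers (the depths and CTA position are assumptions, not data from any real scrollmap tool):

```python
# Hypothetical per-visitor maximum scroll depths, as fractions of page height
max_depths = [0.25, 0.40, 0.55, 0.60, 0.72, 0.80, 0.95, 1.00, 0.30, 0.88]

CTA_POSITION = 0.70  # assume the call to action sits 70% of the way down the page

# How many visitors scrolled at least as far as the CTA?
seen_cta = sum(1 for depth in max_depths if depth >= CTA_POSITION)
share = seen_cta / len(max_depths)
```

If only half your visitors ever reach the call to action, no amount of button redesign will double its clicks — moving the element higher up the page is the more likely fix.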

What mouse movement maps can tell you about your split test

Which page is the most distracting and difficult to use?

Compare mouse movement maps to discover which page is the most distracting. There is an 86% correlation between mouse and eye movement, making these maps a great indicator of where visitors are looking on your website. Mouse movement maps show you what elements of the page are distracting your visitors from your call to action.

Identifying where your visitors are clicking, scrolling and looking provides you with additional insight into why your variation didn’t perform as you expected and what needs to be changed to address issues.

BONUS TIP: Make sure your variation is an improvement before you A/B test

Heatmaps provide an extra dimension of insight into your split test results, but they can also be used before you run the split test, while you’re designing your variations. You can quickly assess your page variation’s design for visibility using a predictive eye tracking tool like EyeQuant.

EyeQuant is an algorithmic tool that scans your design and tells you which elements are most likely to be focused on by visitors. The more important an element is for conversions, the more prominent and visible it should be.

Page elements like calls to action, price promises, reviews and images can have a huge impact on your conversion rate – if they are visible, that is.

We use EyeQuant during our Creative Execution process to learn how visible important page elements are. This way we make sure that they are getting the attention they deserve, giving our variations the best chance possible of winning.

Next steps

As Warren Buffett says, ‘Some you win, some you learn’, and we’re big believers in this philosophy. So next time you get an unexpected split test result, don’t just sweep it under the carpet and hope it goes away. Investigate it, talk about it, learn from it.

We’re always eager to hear how you’re using heatmaps to learn from your split tests. Leave a comment below to share your experiences.
