Split Testing: Is it really just about A/B testing button colours?
Have you ever found yourself looking at an e-commerce website and thinking “Eugh, I’m not buying from this website – the button is green!”?
No, me neither. Yet when I say the words ‘A/B testing’, nine times out of 10 people think it’s all about testing different button colours. Heck, I even use button colours as an example when trying to explain my job to my not-so-tech-savvy family!
The fact of the matter is that A/B testing can be used to test different button colours on your website, and testing different button colours can make a difference to the effectiveness of your website. Whilst researching this article, I found examples of different button colours increasing click-through rate by 21%, 36% and even as much as 88%. But these case studies are full of conflicting advice. For every one that says red wins, there’s an example that says green wins, or yellow, or blue. Confused? So are we!
A/B testing – The power tool in your Conversion Rate Optimisation toolkit
But just because A/B testing can be used to test button colour, that doesn’t mean it’s what A/B testing should be used for. A cordless drill can be used as a paperweight, but that’s hardly the best use of the power tool.
A/B testing is a powerful Conversion Rate Optimisation tool which has the potential to deliver big wins to your business. Here we take a look at how you can get the most from A/B testing on your website.
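Before acting on any headline uplift, it’s worth checking whether the difference between your control and your variation could just be chance. As a minimal sketch (not part of any specific tool mentioned in this article – the visitor and conversion numbers are made up for illustration), here is a standard two-proportion z-test in Python:

```python
from math import sqrt, erf

def ab_test_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variation B's conversion rate
    significantly different from control A's?"""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Illustrative numbers only: 1,000 visitors per arm
p_a, p_b, z, p = ab_test_significance(100, 1000, 130, 1000)
print(f"A: {p_a:.1%}, B: {p_b:.1%}, z = {z:.2f}, p = {p:.4f}")
```

With these illustrative figures (10% vs 13% conversion), the p-value comes out below 0.05 – but run the same variation with only 100 visitors per arm and the same lift would be nowhere near significant, which is exactly why small tests produce so many conflicting button-colour case studies.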
Identify your biggest opportunities
Believe it or not, the success of any split test you run is determined before you’ve even opened your A/B testing tool and created your variation. Conversion Rate Optimisation is a process, and the key to this process lies in understanding your visitor. As I said before, no visitor to your e-commerce website is put off from buying from your site just because of the colour of your ‘Add to basket’ button – there are other factors at play.
Successful Conversion Rate Optimisation programmes use a number of tools to gather and understand what these factors are. This exercise usually results in a large number of opportunities for you to address. Prioritise these opportunities according to impact and strength of evidence and you’ll soon see which areas of your website you should be concentrating on first.
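One simple way to do that prioritisation is to score each opportunity on impact and strength of evidence and sort by the product of the two. This is only a sketch – the opportunity names and 1–5 scores below are invented for illustration, and real programmes often use richer frameworks:

```python
# Illustrative sketch: rank CRO opportunities by impact and evidence.
# The opportunities and 1-5 scores are made-up examples.
opportunities = [
    {"name": "Unclear delivery costs", "impact": 5, "evidence": 4},
    {"name": "Hard-to-find call to action", "impact": 3, "evidence": 2},
    {"name": "Confusing product copy", "impact": 4, "evidence": 3},
]

# Higher score = tackle first; here we simply multiply the two factors.
ranked = sorted(
    opportunities,
    key=lambda o: o["impact"] * o["evidence"],
    reverse=True,
)
for o in ranked:
    print(f'{o["impact"] * o["evidence"]:>2}  {o["name"]}')
```

The point is not the arithmetic but the discipline: whatever scoring you use, writing it down forces you to justify why a test deserves traffic before you build it.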
Of course, if a large proportion of visitors mention that they didn’t buy because they couldn’t find your call to action button, then an A/B test using a more prominent button colour should be high up your agenda.
But I’ve yet to see an e-commerce website where this was the main conversion killer. In our experience, the double-digit increases in revenue have come from understanding visitor needs and making changes which address them. For example, this methodology has been used to increase revenue per visitor by 34.7% by A/B testing a homepage, and by 29.3% by introducing a cheaper delivery option to the checkout. These improvements have been delivered straight to the bottom line – they’re not just higher click-through rates.
Stop A/B testing button colours and make your A/B testing a success
Talking to and understanding your visitors and customers is the key step when determining what to A/B test. So, although changing button colours is something you could test, we wouldn’t recommend it without clear evidence that this is losing your business money. Introduce a process of listening to your visitors and your A/B tests will stand a far better chance of making meaningful improvements to your bottom line.
If you’d like to learn more about split testing, and need expert help, read our ebook below for 8 questions you must ask to find, hire and get great results from CRO professionals.