WhichTestWon’s third annual ‘State of Online Testing’ report was released last week, and it makes for some interesting reading. Clearly, the testing industry has come of age: across the globe, more and more online businesses are seeing the value of split testing. But it’s not all good news. Some responses suggest a worrying gap between how split tests should be run and how they are actually run.
Below we’ve highlighted our 8 key takeaways and provided a dozen bonus facts and figures.
Mobile websites are changing not only the way we shop online, but also where businesses focus their split tests. Nearly half of all respondents to the survey (46%) are running split tests on their mobile site. This figure is up from 30% of respondents in 2013 and 39% last year. In addition, 13% of respondents are testing mobile apps – a percentage that is growing year on year.
… But Australia and New Zealand aren’t far behind.
As has been the case since the report began, respondents in the UK & Europe are the most likely to be running split tests, with 73% conducting tests in 2015. Respondents in the Australia & New Zealand region saw the largest year-on-year increase in testing, jumping from 56% last year to 69% this year.
Large businesses are still the most likely to employ a UX and testing team, but small businesses are now taking on in-house testers more than ever before. There’s been a 1,300% increase in small companies testing in-house, up from only 1% last year to 14% in 2015. That’s nearly as many as large businesses (15%) and well ahead of mid-sized businesses (10%).
Website analytics remains the most-used research tool for constructing split test hypotheses, followed by best practice, ideas from other sites and conversion quality. “Our ‘gut’” sits in the middle of the list, and its use has decreased from 64% in 2013 to just over half (53%) in 2015.
That’s good news: in our experience, ‘gut feel’ is the method least likely to produce high sales uplifts. We’d still like to see some of the lesser-used tools – usability studies, visitor surveys and eye-tracking heatmaps – higher up the list, as it’s insights from these that can skyrocket the success of your split tests.
Every year since 2013, fewer testers have reported measuring “just” conversions. The percentage measuring immediate clicks has remained relatively static, as more testers adopt conversion ‘quality’ metrics to measure their success. These metrics include Revenue Per Visitor (our metric of choice), Lifetime Customer Value and Average Order Value – all of which have a direct impact on the bottom line and represent what really matters to successful businesses.
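To make the distinction concrete, here’s a minimal Python sketch – all figures are hypothetical – showing why Revenue Per Visitor can tell a different story from conversion rate alone: a variation can convert more visitors yet earn less per visitor.

```python
# Hypothetical split-test results: 10,000 visitors per variation.
# Variation B converts better, but A earns more revenue per visitor.

def conversion_rate(conversions: int, visitors: int) -> float:
    return conversions / visitors

def revenue_per_visitor(revenue: float, visitors: int) -> float:
    return revenue / visitors

visitors = 10_000

# Variation A: fewer, larger orders
print(conversion_rate(400, visitors))          # 0.040 (4.0%)
print(revenue_per_visitor(52_000, visitors))   # £5.20 per visitor

# Variation B: more, smaller orders
print(conversion_rate(500, visitors))          # 0.050 (5.0%)
print(revenue_per_visitor(47_500, visitors))   # £4.75 per visitor
```

Judged on conversion rate, B wins; judged on revenue per visitor, A does. That’s the case for measuring quality, not just clicks.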
18% of testers still stop their split test as soon as the tool reports conclusiveness. Although this is down from 25% last year, it’s a practice we, as an industry, need to eradicate. The danger is that a “winning variation” goes live on the website and bombs, losing money, damaging the brand or hurting the business.
Even more worrying, 46% of marketers and 37% of vendors don’t know that industry best practice dictates that a test shouldn’t be declared a winner until it has reached a confidence level of 95% or higher.
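For readers wondering where that 95% figure comes from, one common approach – not necessarily the calculation your testing tool uses, and with hypothetical numbers throughout – is a two-proportion z-test on the conversion rates of the two variations:

```python
import math

def confidence_level(conv_a: int, visitors_a: int,
                     conv_b: int, visitors_b: int) -> float:
    """Two-tailed two-proportion z-test: confidence that the two
    variations' conversion rates genuinely differ."""
    p_a = conv_a / visitors_a
    p_b = conv_b / visitors_b
    # Pooled conversion rate under the null hypothesis of no difference
    p = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = math.sqrt(p * (1 - p) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-tailed confidence: P(|Z| <= |z|) = erf(|z| / sqrt(2))
    return math.erf(abs(z) / math.sqrt(2))

# Hypothetical tests, 10,000 visitors per variation:
print(confidence_level(500, 10_000, 565, 10_000))  # ~0.96 – clears the 95% bar
print(confidence_level(500, 10_000, 540, 10_000))  # ~0.80 – keep the test running
```

This also shows why stopping the moment a tool first reports significance is risky: repeatedly peeking at the results inflates the false-positive rate, which is why deciding the sample size or test duration in advance matters.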
These figures may reflect a wave of rookie testers entering the field, and they underline the need for greater education and training for anyone running split tests to inform business decisions.
Marketers and organisations are still finding reasons not to test.
At a company level, lack of time is the most prominent reason (23%), followed by lack of resources (17%). Perhaps unsurprisingly, given the increase in in-house hiring, this latter figure is down from 27% last year – so the business intent is there.
For marketers, it’s a similar picture. The largest group of respondents (28%) say they “plan to test. It’s just not implemented yet”. Lack of time was less of a concern year on year, but we were worried to see 8% of respondents reply that “management doesn’t support testing”.
Interested in talking to a web conversion agency to help you get the most out of your split testing? Please contact us or give us a call on 020 7887 2695.
With this report, WhichTestWon aims to provide independent, unbiased data on split testing and Conversion Rate Optimisation. The data was collected via a 32-question survey, with responses gathered between 14 and 24 August 2015. In total, 685 respondents took the survey, which was distributed to WhichTestWon’s readers, members and followers, and to the wider marketing world via email and social media.
If you’re serious about initiating change within your business, we’d like to offer you a 60-minute Initial Strategic Review.
We’ll share what we’ve learned from decades of experience working with businesses using optimisation, innovation and experimentation to achieve business goals like yours.