
24 September 2015

WhichTestWon’s third annual ‘State of Online Testing’ report was released last week and it makes for some interesting reading. Clearly, the testing industry has now come of age, and across the globe, more and more online businesses are seeing the value of split testing. But it’s not all good news. Some responses appear to suggest a worrying gap between how split tests should be run and how they are actually run.

Below we’ve highlighted our 8 key takeaways and provided a dozen bonus facts and figures.

1. Mobile testing is growing at an unprecedented level

Mobile websites are changing not only the way we shop online, but also where businesses focus their split tests. Nearly half of all respondents to the survey (46%) are running split tests on their mobile site. This figure is up from 30% of respondents in 2013 and 39% last year. In addition, 13% of respondents are testing mobile apps – a percentage that is growing year on year.


2. The tests that work best depend on what type of business you’re in

If you have an ecommerce website:

  • landing page tests were the most likely to have a positive impact
  • tests optimising add to cart elements generated the highest impact
  • security icon tests were the most likely to have no/negative impact


If you have a lead generation website:

  • landing page tests generated the highest impact
  • copy tests and homepage tests were the second and third most likely to have a positive impact
  • overlay tests were the most likely to have no/negative impact


If you’re an engagement marketer:

  • landing page tests generated the highest impact
  • navigation tests were the most likely to have a positive impact, although the impact was more often classed as ‘moderate’ than ‘high’ compared to 2014
  • social media icon tests were the most likely to have no/negative impact for the second year in a row


3. Marketers in the UK & Europe are most likely to be testing…

… But Australia and New Zealand aren’t far behind.

As has been the case since the report began, respondents in the UK & Europe are the most likely to be running split tests, with 73% conducting tests in 2015. Respondents in the Australia & New Zealand region saw the largest year-on-year increase in testing, jumping from 56% last year to 69% this year.


4. Looking to make a career in testing? Now small businesses want you too.

Large businesses are the most likely to employ a UX and testing team. But now, more than ever before, small businesses are hiring in-house testers. There’s been a 1300% increase in small companies testing in-house, up from only 1% last year to 14% in 2015. That’s nearly as many as large businesses (15%) and well ahead of mid-sized businesses (10%).


5. Hypotheses are less likely to be constructed based on “gut feel”

Website analytics remains the most-used research tool for constructing split test hypotheses, followed by best practice, ideas from other sites and conversion quality. “Our ‘gut’” sits in the middle of the list, and its use has decreased from 64% in 2013 to just over half (53%) in 2015.

That’s good news as, in our experience, ‘gut feel’ is the method least likely to result in high sales uplifts. We’d still like to see some of the lesser-used tools – usability studies, visitor surveys and eye-tracking heatmaps – appear higher on the list, as it’s insights from these that can skyrocket the success of your split tests.


6. Success is being measured by more metrics – not just conversion.

Every year since 2013, fewer testers have reported measuring “just” conversions. The percentage measuring immediate clicks has remained relatively static, as more testers adopt conversion ‘quality’ metrics to measure their success. These metrics include Revenue Per Visitor (our metric of choice), Lifetime Customer Value and Average Order Value – all of which have a direct impact on the bottom line and represent what really matters to successful businesses.
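These quality metrics are simple ratios, and it helps to see how they fit together. A minimal sketch with invented numbers (our own illustration, not from the report):

```python
# Hypothetical results for one test variation (all numbers invented)
visitors = 10_000
orders = 320
revenue = 24_800.0  # total revenue from those orders

conversion_rate = orders / visitors       # share of visitors who bought
average_order_value = revenue / orders    # AOV: revenue per order
revenue_per_visitor = revenue / visitors  # RPV: revenue per visitor

print(f"Conversion rate: {conversion_rate:.2%}")      # 3.20%
print(f"AOV:             {average_order_value:.2f}")  # 77.50
print(f"RPV:             {revenue_per_visitor:.3f}")  # 2.480
```

Note that RPV is just conversion rate × AOV, which is why it can rise even when the raw conversion rate falls – exactly the kind of trade-off a conversions-only metric hides.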


7. A worrying percentage of testers don’t follow best practice when it comes to test duration and statistical significance

18% of testers still stop their split test as soon as the tool reports a conclusive result. Although this is down from 25% last year, it’s still a practice we, as an industry, need to eradicate. The danger is that a “winning variation” goes live on the website and bombs, losing money, damaging the brand or hurting the business.
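The risk can be demonstrated without any real traffic. Below is a rough simulation (our own illustration, not from the report) of an “A/A test”: both variations share the same true conversion rate, so any declared “winner” is a false positive. Stopping at the first “significant” interim reading declares far more false winners than checking once at the end:

```python
import math
import random
from itertools import accumulate

def confidence(conv_a, n_a, conv_b, n_b):
    """Two-sided confidence (0-1) that two conversion rates differ,
    using a pooled two-proportion z-test."""
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 0.0
    z = abs(conv_a / n_a - conv_b / n_b) / se
    return math.erf(z / math.sqrt(2))  # equals 1 - (two-sided p-value)

random.seed(1)
SIMS, VISITORS, PEEKS = 300, 4000, 20
step = VISITORS // PEEKS

early_stops = final_only = 0
for _ in range(SIMS):
    # Both arms have the same true 5% conversion rate (an A/A test),
    # so every "winner" below is a false positive.
    ca = list(accumulate(random.random() < 0.05 for _ in range(VISITORS)))
    cb = list(accumulate(random.random() < 0.05 for _ in range(VISITORS)))
    # Peek 20 times, stopping at the first reading over 95% confidence
    early_stops += any(confidence(ca[n - 1], n, cb[n - 1], n) >= 0.95
                       for n in range(step, VISITORS + 1, step))
    # Or check significance only once, at the planned end of the test
    final_only += confidence(ca[-1], VISITORS, cb[-1], VISITORS) >= 0.95

print(f"False winners when stopping at first 'significant' peek: {early_stops}/{SIMS}")
print(f"False winners when checking once at the end:             {final_only}/{SIMS}")
```

Each extra peek is another chance for noise to cross the 95% line, which is why fixing the test duration in advance (or using a sequential-testing correction) matters.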


Even more worrying, 46% of marketers and 37% of vendors don’t know that industry best practice dictates that a test shouldn’t be declared a winner until it has reached a confidence level of 95% or higher.

These figures may reflect a wave of rookie testers entering the field, and they indicate the need for greater education and training for people running split tests to inform business decisions.
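For readers unsure what that 95% figure means in practice, here is a minimal sketch of the pooled two-proportion z-test that many testing tools use under the hood (the helper name and traffic numbers are our own invention):

```python
import math

def confidence_level(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided confidence (%) that variation B's conversion rate truly
    differs from A's, via a pooled two-proportion z-test."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = abs(rate_b - rate_a) / se
    return 100 * math.erf(z / math.sqrt(2))  # 100 * (1 - two-sided p-value)

# Illustrative numbers: 5,000 visitors per arm
level = confidence_level(250, 5000, 310, 5000)
print(f"Confidence: {level:.1f}% -> "
      f"{'declare a winner' if level >= 95 else 'keep testing'}")
```

A 250 vs 255 conversion split on the same traffic would come out well below 95% – the observed gap has to be large relative to the sampling noise before a winner should be declared.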


8. We’d like to test, but…

Marketers and organisations are still finding reasons not to test.

At a company level, lack of time is the most prominent reason (23%), followed by lack of resource (17%), although, perhaps unsurprisingly given the increase in hiring, this latter figure is down from 27% last year. So the business intent is there.


For marketers, it’s a similar picture. The largest group of respondents (28%) say they “plan to test. It’s just not implemented yet”. Lack of time was less of a concern year-on-year, but we are worried to see 8% of respondents reply that “management doesn’t support testing”.


BONUS: 12 fast facts and figures

  1. 94% of this year’s respondents stated that split testing delivered big or moderate results for their business.
  2. From 2013 to 2015, the number of testers conducting mobile app studies increased by 86%.
  3. B2B testing is becoming more popular, growing 22% in just 3 years.
  4. Currently, 80% of large companies report running split tests in comparison to 74% of mid-sized companies and 58% of small companies.
  5. Testing staff are in demand as their recruitment is growing. This means more businesses are committing to testing and makes recruits with this skillset increasingly more valued in the corporate world.
  6. Although marketing is the most likely department to be in charge of managing tests, its prominence is decreasing in large and mid-sized companies, with web analytics, usability and design & development teams now picking up responsibility.
  7. Increasing numbers of lead gen marketers are now using split testing to increase leads although online sales marketers remain the most likely to test.
  8. In 2015, 63% of online sales marketers said that testing was “significantly worth it” compared to 56% of lead gen marketers and 52% of engagement & brand awareness marketers.
  9. Testers are increasingly running more sophisticated tests with 66% of respondents running segmentation tests, 55% running multivariate tests and 54% running concurrent tests.
  10. Personalisation tests are increasing in popularity with 41% of this year’s respondents reporting having run this type of test.
  11. The majority of respondents (67%) said that their business grew this year although the outlook is more reserved than last year.
  12. 62% of organisations plan to hire, or outsource, split testers within the next 12 months.

Interested in finding out more about the state of online testing? Download the full report here.

Interested in talking to a web conversion agency to help you get the most out of your split testing? Please contact us or give us a call on +44 (0)20 7887 2695.

About the report

With this report, WhichTestWon aims to provide independent, unbiased data on split testing and Conversion Rate Optimisation. The data was collected via a 32-question survey, with responses gathered between 14 and 24 August 2015. In total, 685 respondents took the survey, which was distributed to WhichTestWon’s readers, members and followers and the wider marketing world via email and social media.

Posted in: Conversion Rate Optimisation, A/B Testing, Multi-Variate Testing (MVT)