CRO Case Study: Does Conversion Rate Optimisation work for B2B sites?

It’s often said that the double-digit sales increases that Conversion Rate Optimisation brings only work on consumer websites. However, we have several B2B clients which prove that’s not the case.

This CRO case study shows how a recent project generated 78% more revenue in just six months for a leading B2B website. This outcome came about by strictly applying the rules of the AWA Conversion System™ and following them step by step.

In all, it took just four split tests to gain that incredible double-digit increase. Read on to find out exactly how it was achieved.

Background

Although our client had previously done some split testing internally, results had been erratic, partly because there was no systemised approach. They were keen to try the scientific approach of Conversion Rate Optimisation.

For this project, we used the AWA Conversion System™, a five-step process which, at the time, carried a guaranteed sales uplift of 15%.

The five steps are:

  1. Set up – installing tools and conducting in-depth qualitative and quantitative research
  2. Insight generation – analysis of the research and data to produce hypotheses to test
  3. Triage™ – our proprietary analysis to prioritise which split tests to focus on for best results, and which to ignore
  4. Optimisation Plan – a road map detailing exactly what and how to conduct the tests
  5. Execution – creating wireframes, copy and web page designs, running split tests and monitoring the results

84 Insights Generated

First, we installed a range of conversion rate optimisation tools. For this project, 15 different tools and techniques were used, including:

Quantitative tools & techniques:
  • Google Analytics funnel analysis
  • Reviewing popular site search terms
  • ClickTale heatmaps

Qualitative tools & research:
  • Visitor and customer surveys
  • Interviews with call centre staff
  • Usability testing with 10 members of target visitor groups

These generated 84 insights, which would eventually lead us to develop the lucrative split tests. They included:

  • Visitors were frustrated by the amount of clicking and browsing required to reach the product they wanted
  • Most users had difficulty in identifying suitable products among all the options. They frequently looked around the page for clues and means to narrow down options.
  • All users were motivated to increase their basket size to benefit from free shipping.
  • Almost all users were surprised to discover VAT (and sometimes shipping) being added in the final step
  • Customers often reported being unable to find certain products on the website. In some cases those products were not listed. In other cases, site search did not pick them up.

These are all insights that could just as easily have been found on a B2C website, proving that, whether you sell online to end consumers or to corporate clients, understanding your customers’ wants and needs is the key to a successful Conversion Rate Optimisation project.

Prioritising the Insights – Triage™ Analysis

The 84 insights generated from stage one were analysed and classified into one of four categories:

  1. Top Priority: Major conversion killers which should be addressed by split testing
  2. Further Research Required: Insights which require further analysis to strengthen the evidence base
  3. Chunking Required: Insights which, although they have a low impact on their own, could be grouped together to inform a split test or could just be fixed without testing
  4. Low Priority: Insights which don’t require any immediate attention


The Optimisation Plan – A Roadmap to Success

Triage™ analysis showed us where we needed to focus effort. From the list of Top Priority ‘major conversion killers’, split tests were developed to address the issues. They were detailed in a 90-day optimisation plan.

This showed that the three areas to focus on were, in order, the checkout page, the category pages and finally the homepage. (Without a structured approach, it is often tempting to focus purely on the home page – on both B2B and B2C websites).

B2B Medical Supplier Optimisation Plan

Once the optimisation plan was agreed, we got started on the creative execution process. In just 6 months we delivered four winning split tests, all of which added hundreds of thousands of pounds to the bottom line.

What We Tested

Split Test 1: Checkout Page Delivery Options

Funnel analysis data coupled with visitor feedback, from both the visitor survey and customer survey, left us in no doubt that the checkout page presented us with the biggest opportunity. A number of survey responses mentioned that the cost of delivery was high which helped to explain the large number of exits on the checkout page.

We introduced a two-tier pricing system for delivery. Customers could now choose a lower priced/slower delivery (2-3 days) option or a higher priced/faster (next day) delivery option.

On the checkout, we added a prominent notice urging the customer to

"Get FREE delivery - add £XX to your cart"

Split Test 1 Hypothesis

The research led us to believe that:

  1. Introducing a lower delivery fee option would improve cart completion rate and, consequently, revenue.
  2. Emphasising the free delivery threshold would encourage visitors to purchase more, increasing average order value and revenue.

Split Test 1 Results

This checkout page variation with delivery options achieved a 29.3% increase in revenue per visitor at a 99.5% confidence level.

This was a very satisfying result, and clearly indicates the effect that planning and research have, compared with a scattergun approach or simply trying to adopt ‘best practice’.

But we felt that there were more improvements to be made and proposed a follow-on experiment, fine-tuning this one to see what else could be squeezed out of it. We went back to our initial three-month optimisation plan and revised it to schedule this follow-on experiment before the second test we had planned. This is one of the great benefits of only planning three months in advance: it allows us to be more flexible when things inevitably change and new, high-value opportunities are identified.

Split Test 2: Fine-Tuning Delivery Options

The intention of the follow-on test was to investigate whether there was more to be wrung out of this first test. Discussions had taken place about whether a one-column design would convert better than the two-column format tested in the first A/B test. The purpose of this test was therefore to test this alternative design and monitor the impact on conversion rate and revenue. It wasn't expected to deliver a huge increase.

Split Test 2 Hypothesis

The hypothesis was that this one-column format would look less complex and the simplified design would improve the conversion rate and revenue.

Split Test 2 Results

After seven weeks, the test results showed the new design achieved a 3.3% increase in revenue per visitor. Although this was not statistically significant, the business decided to turn off the test and run with this design as they felt that the results were a good enough indication that the one-column design would be superior.
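For readers curious how confidence levels like these are typically judged (the article doesn't say which tool or method was used), here is a minimal sketch of a common approach, a two-proportion z-test on conversion counts:

```python
import math

def ab_confidence(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test (normal approximation) for an A/B test.

    Returns the relative uplift of B over A and the one-sided
    confidence that B genuinely beats A.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF via the error function
    confidence = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return p_b / p_a - 1, confidence
```

With invented illustrative numbers, 1,000 conversions from 20,000 visitors against 1,100 from 20,000 gives a 10% uplift at roughly 99% confidence; a small uplift like Split Test 2's 3.3% needs far more traffic to reach significance, which is why the business chose to call it early.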

After this result, we moved onto the second test we had planned on their most visited category page.

Split Test 3: Category Page Design

Feedback from user testing showed that visitors were having difficulty navigating to a suitable product page from the category page. ClickTale heatmap data revealed how visitors looked to the left navigation for guidance, but didn't find it helpful and therefore didn't engage with it. This evidence suggested prospects were having difficulty making sense of the available options which was having a big impact on conversion rate.

We designed a new category page that addressed these problems by:

  • Removing meaningless introduction copy from the top of the page
  • Introducing useful and relevant faceted search filters in the left hand navigation
  • Highlighting popular products at the top of the page
  • Listing products in order of popularity

To minimise the impact of the test, and development time required, this test was limited to the one category page with the highest traffic volumes.

Split Test 3 Hypothesis

The hypothesis was that making it easier for visitors to narrow down their options – by removing the meaningless introductory copy, adding relevant faceted search filters and surfacing the most popular products first – would help them find a suitable product and so improve conversion rate and revenue.

Split Test 3 Results

These design changes resulted in a 19.5% increase in the number of customers who added products to their cart, at a 98.0% confidence level. This translated to a 5.2% improvement in revenue per visitor from this one category page alone.

Again, we returned to the optimisation plan and added a follow-on experiment: testing the new design on more category pages to gauge the overall impact before rolling it out site-wide. However, as the potential win from this follow-on test was considered smaller than that of the third test we had scheduled, it was placed further down the optimisation plan, to be scheduled in the next update of our three-month plan.

Split Test 4: Homepage Design

During user testing it became clear that the global navigation did not match the way users wanted to look for products. As a result, users had to spend a long time on the site to find the product they wanted, even if they arrived knowing exactly what they were looking for. Additionally, visitors did not understand the value proposition of the website when looking at the homepage, making them less inclined to purchase from this site rather than from a competitor.

Evidence from ClickTale showed that the most prominent area of the homepage received very little attention, so it was clear there was an opportunity to make better use of this space in a way that matched users’ needs.

ClickTale heatmap shows where customers really click and what they ignore.

Split Test 4 Hypothesis

We hypothesised that changing the homepage to be clearer about the value proposition and aligning the main navigation with the most popular conversion paths would improve revenue. To support this, our new design featured:

  • A more compelling headline
  • New, more persuasive, introductory copy
  • A new top navigation showcasing the top product categories
  • A reduced number of items listed in order of popularity

The value proposition itself was defined by listening to what top customers told us about why they bought from this website rather than a competitor site, then breaking each value argument down into specific benefits that would be compelling to visitors.

Split Test 4 Results

The results were a 25.2% increase in revenue per visitor at a 99.9% confidence level. Further iterations of this test will focus on the value proposition elements, testing different values to establish which ones have the biggest impact.

The Overall Result

In isolation, these A/B tests generated big wins for the client. But the real value of the project emerges when you view them all together: combined, they resulted in a 78% increase in revenue per visitor. That is something very few companies, in either the B2B or B2C sector, would turn their noses up at.
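Because each test's uplift applies on top of the previous ones, the individual gains compound multiplicatively rather than adding up. A quick sketch with the four reported uplifts lands in the same ballpark as the headline figure (the exact number depends on the unrounded results, and the category-page uplift was measured on a single page):

```python
# Reported revenue-per-visitor uplifts from the four split tests
uplifts = [0.293, 0.033, 0.052, 0.252]

# Uplifts compound multiplicatively: each applies to the already-improved baseline
combined = 1.0
for u in uplifts:
    combined *= 1 + u

print(f"Combined uplift: {combined - 1:.1%}")  # prints "Combined uplift: 75.9%"
```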

We proved not only that the scientific approach of Conversion Rate Optimisation can successfully increase sales on a B2B website, but also that results can be seen quickly. We achieved all of this in just six months.

Ready to get results like these for your business?


 

Posted in: CRO Case Studies

 
 
