Our optimisation methodology is heavy on process, but we never forget that every sale depends on a human being. The guiding philosophy is that UX (user experience) exists not on the page but in the mind of the user. All our research, analysis and A/B testing are geared towards helping us get into the consumer’s mind.
New clients are surprised by the amount of time and effort we put into research and analysis, and often blown away by what we find out. They tell us it’s the first time they’ve had a particular insight, or that they have never understood their customers so well.
These insights prove invaluable, not just to the website but to many other parts of the business. For example, finding a way to express your Value Proposition that persuades more people to buy from you has business benefits well beyond the site.
Every one of our recommendations is A/B-tested before it’s made live on your site. It is judged against a predetermined KPI, preferably directly related to revenue. Typically it would be Revenue Per Visitor or Conversion Rate. Winning concepts are those with a statistically high probability of improving this KPI. The end result is a business case for each implementation, reducing risk and helping to use resources wisely.
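To make this concrete, here is a minimal sketch of the kind of significance calculation that sits behind such a judgement: a two-proportion z-test on Conversion Rate. The visitor and order counts are invented for illustration, not taken from a real test, and real testing platforms use more sophisticated statistics.

```python
# Hypothetical example: judging an A/B test on Conversion Rate with a
# two-proportion z-test. All figures below are illustrative.
from math import erf, sqrt

def prob_b_beats_a(conv_a, n_a, conv_b, n_b):
    """One-sided probability that variation B's true rate exceeds A's."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF, computed via the error function
    return 0.5 * (1 + erf(z / sqrt(2)))

# 10,000 visitors per arm; 380 vs 450 orders (illustrative numbers)
prob = prob_b_beats_a(conv_a=380, n_a=10_000, conv_b=450, n_b=10_000)
print(f"Probability that B beats A: {prob:.1%}")
```

A winning concept in this framing is simply one where that probability clears a pre-agreed threshold, which is what turns each implementation into a business case rather than a gamble.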
To start the research process, we may install certain intelligence-gathering tools on your site. Examples are heatmapping, session recording, live recruiting, onsite surveys and analytics.
One of the first tasks is to plot your user journeys using analytics data, to find out where non-buyers are dropping out. This helps to pinpoint the areas to dig deeper into and those to focus on first.
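As an illustration, a first pass at this journey analysis can be as simple as walking a funnel of analytics figures step by step to see where the biggest leaks are. The funnel steps and visitor counts below are hypothetical:

```python
# Illustrative sketch: finding where non-buyers drop out of a journey.
# Step names and visitor counts are invented, not real client data.
funnel = [
    ("Landing page", 50_000),
    ("Product page", 22_000),
    ("Add to basket", 6_500),
    ("Checkout", 3_900),
    ("Order confirmed", 2_100),
]

drop_offs = []
for (step, visitors), (next_step, stayed) in zip(funnel, funnel[1:]):
    drop = 1 - stayed / visitors
    drop_offs.append((step, next_step, drop))
    print(f"{step} -> {next_step}: {drop:.0%} of visitors drop out")
```

The step with the steepest drop-out is usually the first candidate for deeper qualitative digging.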
This follows a thorough audit of your analytics tool, to make sure it is correctly configured and giving us useful data we can trust.
Qualitative and quantitative research are used hand in hand. We use dozens of research techniques, including highly labour-intensive methods such as speaking to your call centre and store staff, reading thousands of survey responses, interviewing customers and watching hours of video of your actual users interacting with your site.
Split tests are not just a means of proving a business case for each recommendation: the outcome of every test is analysed in a way that adds to the bank of data. Nothing is left to chance in getting into the mind of the customer.
In time, personas are constructed out of patterns in the data. In our methodology, personas are firmly grounded in research and used in a way that gives a voice to your users throughout the optimisation process.
An effective prioritisation system helps us to assess ideas objectively, and to overcome the psychological biases all humans have. Typically, dozens of areas for improvement are revealed by the research. These are each scored using our own Triage method, based on factors such as the weight of evidence, expected impact on your KPIs, and the ease with which the test could be built and later implemented by the business.
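A minimal sketch of how such a scoring exercise might work is shown below. The factor names follow the text (evidence, impact, ease), but the ideas, weights and scores are invented for illustration; this is a generic weighted-scoring model, not the actual Triage method.

```python
# Hypothetical weighted scoring of improvement ideas. The ideas,
# weights and scores are invented; AWA's real Triage model differs.
ideas = {
    "Clarify delivery costs earlier": {"evidence": 5, "impact": 4, "ease": 3},
    "Rewrite product page copy":      {"evidence": 3, "impact": 3, "ease": 4},
    "Redesign checkout flow":         {"evidence": 4, "impact": 5, "ease": 1},
}

# Assumed relative importance of each factor (must sum to 1)
weights = {"evidence": 0.4, "impact": 0.4, "ease": 0.2}

def triage_score(factors):
    return sum(weights[name] * score for name, score in factors.items())

# Rank ideas from highest to lowest score to form a roadmap
roadmap = sorted(ideas, key=lambda name: triage_score(ideas[name]), reverse=True)
for name in roadmap:
    print(f"{triage_score(ideas[name]):.1f}  {name}")
```

Scoring every idea against the same weighted factors is what lets the ranking stay objective when dozens of candidate improvements compete for the same build resource.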
From the prioritisation exercise, ideas are ranked in order of strategic and commercial importance, and this forms the basis of a roadmap.
The infographic below demonstrates how this optimisation plan is a working tool, not a rigid document. It’s fluid, reacting to developing market forces and new insights that may come to light. The decision tree after each test shows that the results of one test directly influence what happens next.
When results are promising, we may iterate to try to squeeze out even more. Equally, if a test shows a downturn, we will investigate the reasons and may adjust the hypothesis to come up with a new treatment. Often these are the tests that lead to the most dramatic uplifts, because there is the deepest understanding of the underlying behavioural realities.
Experiments are integral to our optimisation process, and carried out in a meticulously scientific way. Most common is A/B testing or A/B/n testing, but in certain cases we use another type of split test called a multivariate test (MVT). The starting point is a hypothesis that states what change in behaviour we expect to see for each change introduced on the page.
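One practical question behind every such experiment is how much traffic it needs before it can detect the change the hypothesis predicts. The sketch below uses the standard normal approximation for comparing two proportions; the baseline conversion rate, target uplift, significance level and power are all assumed for illustration.

```python
# Hedged sketch: estimating visitors needed per variation before an
# A/B test can detect a given uplift. Normal approximation; the
# baseline rate and uplift below are illustrative assumptions.
from math import ceil

Z_ALPHA = 1.96   # two-sided 95% significance
Z_BETA = 0.84    # 80% power

def sample_size_per_arm(baseline, relative_uplift):
    p1 = baseline
    p2 = baseline * (1 + relative_uplift)
    p_bar = (p1 + p2) / 2
    delta = p2 - p1
    n = 2 * (Z_ALPHA + Z_BETA) ** 2 * p_bar * (1 - p_bar) / delta ** 2
    return ceil(n)

# Detecting a 10% relative lift on a 4% baseline conversion rate
n = sample_size_per_arm(baseline=0.04, relative_uplift=0.10)
print(f"Visitors needed per variation: {n:,}")
```

Small expected uplifts on low baseline rates demand large samples, which is one reason test concepts are prioritised and peer-reviewed before traffic is committed to them.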
The new web page (or variation) is peer reviewed by other AWA optimisers before the test is built. Results are not simply reported, but presented with recommendations on how to use the learnings to optimise the website further still.
Discover the 38 main factors that are critical to your success.
If you're looking to drive profits and growth or get clarity and insights from your data, start with a free consultation or contact us.