User Research: What A/B testing is and how it killed personal opinions
You are launching a new marketing campaign, but you struggle: it’s difficult to choose which visual or which claim works best among all the ones you like. What usually happens is that a committee starts a conversation about it. The conversation often turns into a heated debate, because everyone has a different idea of the impact each option could have. After hours of unfruitful discussion, you finally vote and select one key visual for the campaign. Phew! That was hard. Couldn’t it be easier? Of course, you already know our answer: yes, it can be pretty simple.
Preference versus action
When a committee decides how a campaign should look, there is a problem: people base their decision on personal preference. They don’t take into account the preferences of their customers. The decision is not based on data showing what makes the targeted people, your users or customers, act. You can imagine what your customers like, and that’s fine. But in fact, we are more interested in what actually makes your customers act. An ad they don’t like but still click on is better than a beautiful ad that leads to no clicks.
What is A/B testing?
A/B testing, or more generally split testing, is a method that beats opinion with data. Instead of choosing a version or variation yourself, you test them against each other.
How to run an A/B test
The first step is to set up a test group. To do this, you select a sample of your target group, then divide that sample by the number of variations you want to test. In an A/B test, you have two variations. If you have more than two variations, the technical term is “split testing”.
The second phase is to let people test it out. Let’s say we run an A/B test on online ads. You want to find out what makes people go to your website: is it the wonderful price of your product, or its features? In your test, you have one ad with the price written in big type, and one ad with the features or product benefits explained. The first half of your potential customers sees the ad with the price; the second half sees the ad with the product benefits.
The third phase is measurement. To keep it simple, the ad that generates more clicks to your website is the winner. In reality, it is a bit more complex: more parameters go into the optimization of online ads, but that’s a topic for another article.
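The three phases above can be sketched in a few lines of code. This is a minimal illustration, not Enigma’s actual tooling: the user IDs, click counts, and view counts are invented numbers, and the winner check uses a standard two-proportion z-test, which the article does not prescribe but which is a common way to decide whether a click-through difference is more than chance.

```python
import hashlib
import math

def assign_variant(user_id: str, variants=("A", "B")) -> str:
    # Phase 1: split the sample. Hashing the user ID makes the
    # assignment deterministic, so each visitor always sees the
    # same variant across repeat visits.
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    return variants[bucket % len(variants)]

def z_test(clicks_a: int, views_a: int, clicks_b: int, views_b: int):
    # Phase 3: measure. Two-proportion z-test on click-through
    # rates; a small p-value means the difference between the two
    # ads is unlikely to be random noise.
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Phase 2 happens in the ad platform: half the audience sees ad A
# (price), half sees ad B (benefits). Hypothetical results:
z, p = z_test(clicks_a=120, views_a=5000, clicks_b=180, views_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these made-up numbers, ad B’s higher click-through rate comes with a p-value well below 0.05, so you would declare it the winner rather than relying on the committee’s taste.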
What can we A/B test?
We can use A/B testing as a method to take many debates out of the meeting room. With A/B testing you get data that shows you which direction to take. Here are a few examples of what you can find out with a good A/B testing strategy:
- What color should your calls to action be?
- What copy creates more conversions?
- What works better for women and what works better for men?
- What works better in each region/culture?
- What price are your customers willing to pay for your product?
A/B testing: up to 500% more clicks
Every advertising campaign that Enigma launches includes A/B testing. And we don’t stop there: we also use A/B testing to optimize websites and to create innovative services and products.
One example is particularly striking:
Through consistent A/B testing, the performance of a conversion campaign for the chocolate brand Ragusa For Friends was improved by over 500% compared to the first version, which was made without Enigma’s support.
A campaign without A/B testing is dumb
When you see the results highlighted above, the conclusion is clear: running a marketing campaign without A/B testing is a missed opportunity. It’s like throwing your media budget out of the window. With a good A/B testing strategy, you can optimize your campaigns so that, without buying more media space, you get more results and even lower your cost per result.
Discover more about the power of A/B testing
In the following weeks, we will publish more articles about the power of A/B testing. To get notified of their publication, join our newsletter using the form below.
I want to thank Patrizia Lamprecht for her help with this article. With her experience in A/B testing, she was a great support.