Are you sure that your current website is the best version it can be? It’s easy to make assumptions about what might be going wrong for your customers, but there is only one way to find out for sure. By observing user behavior and running A/B tests, you gain invaluable insights into your visitors, which can then inform changes that improve your conversion rate.
What Is A/B Testing and Why Should You Do It?
A/B testing is when you put at least two versions of a web page live at the same time for a designated time period and split the traffic to see which one increases your conversion rate.
In order to A/B test, you first have to start with a hypothesis. This is a theory, based on analytics and other available user data, about why users are behaving a certain way, and about how you could change the page to improve your conversion rate. This change could be as simple as using different copy, changing the location or color of a call-to-action (CTA) button, or introducing something new to the page.
Your conversion rate could be based on one or multiple goals, though we would always recommend a small number of goals per test. Some goals, for example, could be: contact form leads, registrations, downloads, newsletter signups, completed purchases, etc.
The tools you use should also allow you to do multivariate testing when you need to test how individual page modifications affect user behavior. Find out more about which test type is the best for your needs.
When you look at your user data and still can’t figure out why your customers are behaving a certain way, testing is the best way to find out. Testing different page variants lets you verify that a change actually improves your conversion rate before it has a chance to negatively impact your bottom line.
How A/B Testing Works
When an A/B test is active and a user navigates to that page, they will be shown either the A or the B version of the page. A persistent cookie stored in the user’s browser records which version they were given, which enables the system to track whether that user goes on to complete a conversion.
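The assignment logic described above can be sketched in a few lines. This is a minimal illustration, not Kentico’s actual implementation: the cookie name is hypothetical, and a plain dictionary stands in for the browser’s cookie jar. The key point is that the variant is chosen once and then reused, so a returning visitor always sees the same page and their conversion is attributed to the right variant.

```python
import random

VARIANTS = ("A", "B")
COOKIE_NAME = "abtest_variant"  # hypothetical cookie name for illustration

def assign_variant(cookies: dict) -> str:
    """Return the variant stored in the user's cookies, assigning one
    at random (a 50/50 split) on the first visit. The stored cookie
    keeps the assignment stable across later page views."""
    if COOKIE_NAME not in cookies:
        cookies[COOKIE_NAME] = random.choice(VARIANTS)
    return cookies[COOKIE_NAME]

# Simulated browser cookie jar: the first call assigns a variant,
# every later call returns the same one.
jar = {}
first_visit = assign_variant(jar)
repeat_visits = [assign_variant(jar) for _ in range(5)]
print(first_visit, repeat_visits)
```

In a real deployment the cookie would be set via an HTTP `Set-Cookie` response header with an expiry date, so the assignment survives browser restarts for the duration of the test.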
How to Set Up Your First A/B Test
Once you have your hypothesis and conversion goals ready, you can set up your A/B test. Conversions are desired events or actions on your website that you want users to complete, such as registering for a service, completing an order, subscribing to a newsletter, or downloading a specific form. Conversions are required for A/B testing: without a predetermined desired outcome, it is impossible to measure the success or failure of a test.
Conversions can be given a value so they can be weighted. For example, you may consider registering an account a more valuable conversion than signing up for your newsletter, so you give the account registration conversion a value of 10 and the newsletter signup a value of 5.
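Weighting works like a simple lookup and sum. The sketch below uses the example values from above (10 for a registration, 5 for a newsletter signup); the event names are hypothetical, chosen only for illustration.

```python
# Hypothetical conversion values mirroring the example in the text.
CONVERSION_VALUES = {
    "account_registration": 10,
    "newsletter_signup": 5,
}

def total_conversion_value(events: list[str]) -> int:
    """Sum the weighted value of a list of recorded conversion events."""
    return sum(CONVERSION_VALUES[e] for e in events)

events = ["newsletter_signup", "account_registration", "newsletter_signup"]
print(total_conversion_value(events))  # 5 + 10 + 5 = 20
```

With weights in place, a variant that drives fewer but higher-value conversions can still come out ahead of one that drives many low-value ones.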
Creating A/B Tests
You start with the original variant: generally a page already published on your site that you wish to test. This is your “A” variant. You then create a copy of that page and, using the content management capabilities of your CMS, make some changes, whether to the call to action’s position or label, or to the content itself, changing headings or descriptions to something you feel could be more appealing to your users. This is your “B” variant.
You will need to set up your A/B testing tool with a start date and an end date, and specify how much traffic to include in the test. Most tools will also let you choose the success metric for the test: conversion rate, total conversion value, average conversion value, or total conversion count.
Once you’ve configured your A/B test, you’re ready to launch it.
Analyzing the Results
Reviewing the results of your A/B test should be simple. Ideally, you should be able to monitor the results in real time. You should be able to see how many visitors each variant got, the conversion rate each variant achieved, the total conversion value of each variant, the average conversion value of each variant, and the total conversion count for each variant.
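The per-variant figures listed above are straightforward to derive from raw data. The sketch below assumes you have, for each variant, a visitor count and a list of the conversion values recorded; the function name and sample numbers are hypothetical.

```python
def variant_metrics(visitors: int, conversions: list[float]) -> dict:
    """Compute the standard A/B reporting metrics for one variant.
    `conversions` holds the value of each conversion recorded."""
    count = len(conversions)
    total_value = sum(conversions)
    return {
        "visitors": visitors,
        "total_conversion_count": count,
        "conversion_rate": count / visitors,
        "total_conversion_value": total_value,
        "average_conversion_value": total_value / count if count else 0.0,
    }

# Hypothetical sample data for two variants.
a = variant_metrics(visitors=1000, conversions=[10, 5, 10, 5, 5])
b = variant_metrics(visitors=1000, conversions=[10, 10, 5, 10, 5, 10, 5])
print(a["conversion_rate"], b["conversion_rate"])  # 0.005 vs 0.007
```

Comparing the two dictionaries side by side gives the same picture a testing dashboard would show in real time.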
Putting the Winner Live
Once you’re happy that you have reached statistical significance with your results, you can select which variant is the winner.
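Most testing tools, Kentico’s included, report significance for you, but it helps to know what is happening underneath. A standard technique for comparing two conversion rates is the two-proportion z-test sketched below; this is a generic statistical sketch, not Kentico’s specific implementation, and the sample figures are hypothetical.

```python
import math

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for a difference in conversion rates, using the
    normal approximation. A p-value below 0.05 is a common threshold
    for declaring a statistically significant result."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical results: A converts 120/2400 (5%), B converts 180/2400 (7.5%).
p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=180, n_b=2400)
print(f"p-value: {p:.4f}")
```

The intuition: the smaller the p-value, the less likely the observed difference is due to random chance alone, which is why you should let a test run until significance is reached rather than calling a winner early.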
Remember that the B version will not always win, but we can learn from each and every test and go back to the drawing board with another hypothesis to improve the conversion rate further.
Once you’re finished, move onto another page or funnel on your site and do more A/B testing, or go back and test another hypothesis on the same page. The possibilities for A/B testing are endless.
Never assume you know your users: test, iterate, and test again to confirm what you believe. Conversion rate optimization is the constant pursuit of perfection; it is never complete.
Kentico Gives You the Control
Kentico includes very powerful out-of-the-box testing tools to help you create, manage, and analyze your own tests without any technical knowledge required.
With the help of Kentico, you can set up multiple A/B or multivariate tests, delve into the results, and select a winner to go live—giving you complete control to improve the user experience on your site.
Do you have experience with A/B testing? How did it help you improve your bottom line? Did it affect major or minor design decisions? Have you learned from both your successes and failures? Let us know in the comments section below.
Strata3 has been a Kentico CMS partner since 2012 and became a certified Kentico Gold Partner in 2014. Our Google-certified analysts implement and manage analytics and produce the insights and metrics that inform all of our design and development decisions.