A/B testing (also known as split testing) is a form of quantitative analysis that compares two live versions of a web page, product, email, or system to see which performs better. By creating and testing an A and a B version, you can make focused design changes, test hypotheses, and measure whether a variation produces a statistically significant difference in a well-defined user action.
The goal of a split test is to look at differences in the behavior of two groups and measure the impact of each version on an actionable metric.
You can A/B test headlines, subheadlines, paragraph text, testimonials, call-to-action text, call-to-action buttons, links, images, content near the fold, social proof, media mentions, awards, and badges.
Your analytics will often provide insight into where you can begin optimizing. It helps to begin with high traffic areas of your site or app, as that will allow you to gather data faster. Look for pages with low conversion rates or high drop-off rates that can be improved.
Your conversion goals are the metrics you use to determine whether the variation is more successful than the original version. Goals can be anything from clicking a button or link to product purchases and email signups.
Once you've identified a goal you can begin generating A/B testing ideas and hypotheses for why you think they will be better than the current version. Once you have a list of ideas, prioritize them in terms of expected impact and difficulty of implementation.
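One common way to prioritize is to score each idea on expected impact versus implementation effort and test the highest-scoring ideas first. The sketch below illustrates this with a simple impact/effort ratio; the idea names and scores are purely hypothetical.

```python
# Hypothetical backlog of A/B test ideas, scored 1-5 for expected
# impact and implementation effort (names and numbers are illustrative).
ideas = [
    {"name": "Shorter headline", "impact": 3, "effort": 1},
    {"name": "New checkout flow", "impact": 5, "effort": 5},
    {"name": "Green CTA button", "impact": 2, "effort": 1},
]

# Rank by impact-to-effort ratio: high-impact, low-effort ideas come first.
ranked = sorted(ideas, key=lambda i: i["impact"] / i["effort"], reverse=True)

for idea in ranked:
    print(idea["name"])
```

Any scoring scheme works (ICE, PIE, and similar frameworks are common); the point is to make the trade-off between impact and difficulty explicit before you build anything.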
Using your A/B testing software (such as Optimizely), make the desired changes to an element of your website or mobile app experience. This might be changing the color of a button, swapping the order of elements on the page, hiding navigation elements, or something entirely custom. Many leading A/B testing tools have a visual editor that makes these changes easy. Be sure to QA your experiment to confirm it works as expected.
Kick off your experiment and wait for visitors to participate! At this point, visitors to your site or app will be randomly assigned to either the control or variation of your experience. Their interaction with each experience is measured, counted, and compared to determine how each performs.
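Random assignment is typically done deterministically, by hashing a stable user identifier, so that the same visitor always sees the same variant across sessions. A minimal sketch (the function and experiment names are assumptions, not any particular tool's API):

```python
import hashlib

def assign_variant(user_id: str, experiment: str,
                   variants=("control", "variation")) -> str:
    """Deterministically bucket a user into a variant.

    Hashing the experiment name together with the user ID means the
    same user always lands in the same bucket for a given experiment,
    while different experiments split users independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The assignment is stable across calls and sessions:
print(assign_variant("user-42", "cta-color"))
```

Because the split is driven by a hash rather than stored state, no per-user lookup table is needed; assignment can be recomputed anywhere.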
Once your experiment is complete, it's time to analyze the results. Your A/B testing software will present the data from the experiment and show you the difference between how the two versions of your page performed, and whether there is a statistically significant difference.
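Under the hood, significance for conversion-rate experiments is often assessed with a two-proportion z-test. A self-contained sketch, using only the standard library (the sample counts below are made up for illustration):

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference between two conversion rates.

    conv_a / n_a: conversions and visitors in the control.
    conv_b / n_b: conversions and visitors in the variation.
    Returns (z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 200/5000 control vs. 260/5000 variation.
z, p = two_proportion_z_test(conv_a=200, n_a=5000, conv_b=260, n_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

If the p-value falls below your chosen threshold (commonly 0.05), the difference is unlikely to be due to chance alone; otherwise, the experiment is inconclusive rather than proof that the variants perform the same.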
The UX process should be tailored to the designer's experience and the project's needs in order to achieve the desired goals.
“With the passage of time, the psychology of people stays the same, but the tools and objects in the world change.”
Donald A. Norman, The Design of Everyday Things.