A/B testing, also known as split testing, is a common method marketers use to compare two variations of an asset, such as a landing page or an email subject line, with the goal of increasing conversion rates or open rates, respectively.

Split testing can also be performed earlier in the process to learn if a design will be effective with users once implemented. In this article, we’re going to take a look at how Maze allows you to split test your design in a few steps.

What is A/B testing?

A/B testing is the process of testing two (A vs. B) or more variations to find out which one performs better. Applied to design, split testing is used to validate a design hypothesis early in the product development process.

To get started, you'll need to create the first version of a design, also called the Control version. By changing one variable in the Control, such as the placement of a subscription box or the call-to-action (CTA), you’ll create the second version, also known as the Variation.

One of the most important parts of split testing is deciding beforehand which metric you’ll use to measure success once testing is complete. There are many insights you could extract from an A/B test, and knowing exactly what you’re measuring is essential.

How to A/B test your design with Maze 💁

With Maze, you can A/B test your prototype very easily. We currently support InVision, Marvel, and Sketch prototypes. By importing the first version of your prototype to Maze, you'll create the Control test.

💡 Tip: Learn more about creating your first Maze.

To create the Variation, go back to your prototyping tool and make changes to the same prototype - no need to create a new one. We recommend changing just one variable so that the test results accurately reflect that change. Here are some ideas on what you can test:

  • Text: titles, descriptions, or CTAs
  • Visual elements: images, icons, colors
  • User flow: the path taken by the user to move through the prototype

Once you're done making the changes, go back to Maze and click on the Import new prototype version button inside your project. This will create a new Maze with the changes you've made to your prototype.


Finish by defining missions for both Mazes: write titles and descriptions, and lay out the paths you expect users to take. Once you’re done, set the Mazes live and share the link with your testers.

💡 Tip: Avoid testing both versions with the same participants. For the best results, randomize the selection of testers for each version.

Analyze results and implement the winner 📊

As mentioned above, it’s important to determine beforehand the metric you’ll be measuring success by. With Maze, you get high-level results for each mission, such as success rate, misclick rate, and time spent on screen, as well as metrics on individual sessions, such as test duration or the clicks a tester made on each screen.

Before creating your A/B test, establish the success metric that’s most valuable for your product.

For instance, when testing user flows, you can compare the success rate of each mission across both Mazes and also look at the paths users took most often. When testing CTAs, misclick rate and time spent on screen are strong indicators of performance.

Once testing is done, compare the KPIs of the Control Maze to the KPIs of the Variation. The design that performed better on your success metric is the winner, and you'll have a clear understanding of what to implement.
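If you want to check that the difference between the two Mazes is more than noise, you can run a quick significance check on the numbers yourself. The snippet below is a minimal sketch in Python using a standard two-proportion z-test; the success counts and variable names are hypothetical examples, not Maze output.

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical results: successful completions out of total sessions per Maze.
control_successes, control_total = 38, 60      # Control Maze
variation_successes, variation_total = 49, 60  # Variation Maze

p_control = control_successes / control_total
p_variation = variation_successes / variation_total

# Pooled proportion and standard error for a two-proportion z-test.
pooled = (control_successes + variation_successes) / (control_total + variation_total)
se = sqrt(pooled * (1 - pooled) * (1 / control_total + 1 / variation_total))

z = (p_variation - p_control) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value

print(f"Control success rate:   {p_control:.0%}")
print(f"Variation success rate: {p_variation:.0%}")
print(f"z = {z:.2f}, p-value = {p_value:.3f}")
```

A small p-value (commonly below 0.05) suggests the difference in success rates is unlikely to be due to chance; with a handful of testers per version, treat the result as directional rather than conclusive.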

💡 Tip: Consult our short guide to help you read the results on your dashboard.

The takeaway

Evidence-based design decisions take real user preferences into account. By split testing your design at the prototype stage, you'll build a product shaped by those preferences.

What's more, you'll avoid costly back-and-forth between the design and development teams and stay focused on bringing a valuable product to market.
