How to create a great experience for your testers while getting the most out of your Mazes.
#1. Keep the number of missions under 7
Remote testing sessions are very different from face-to-face interviews: testers aren't dedicating 100% of their focus to your product. That's actually good news, because in real life, your users won't be fully focused either.
As testers are giving away some of their free time, their attention drops significantly the longer your Maze lasts. A good rule of thumb is to keep the number of missions under 7 and the average completion time under 5 minutes.
Try sharing your draft Maze with a co-worker: track how long it takes them to complete it, and if it's over 5 minutes, you might need to simplify your Maze.
#2. Start with a simple & straightforward mission
Chances are your testers are new to the concepts of prototyping and user testing. To help them, we've added the following three-point warning at the beginning of the Maze:
✔︎ This is not the final product, only a succession of interactive pages.
✔︎ You will never be asked to type or perform a particular gesture, only to click.
✔︎ If something doesn't respond on click, it's not clickable.
Even with these warnings in place, it's common to see signs of user frustration during the first mission (misclicks, longer time spent on pages, bounces...).
A great way to introduce the concept of prototyping (and Maze) to your testers is to start with a simple and straightforward mission: a 3-slide clickable walkthrough for your app has proven to work wonders!
#3. Keep your descriptions under 140 characters
Think of your mission description as a tweet: it conveys the general goal without going into too much detail. If you find yourself writing more than 140 characters, you're either:
- Giving the tester too many interaction details ("Go to page X and click on button Y"), which leads to biased results.
- Creating a mission that should be broken down into two separate missions.
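If you draft mission copy in bulk (in a spreadsheet or doc before pasting it into Maze), a tiny script can flag descriptions that break the 140-character rule. This is a hypothetical helper, not a Maze feature, and the sample missions below are invented:

```python
# Flag mission descriptions that exceed the 140-character "tweet" rule.
# Hypothetical helper script -- not part of Maze; the sample data is invented.
def overlong_missions(missions, limit=140):
    """Return the names of missions whose description exceeds `limit` characters."""
    return [name for name, text in missions.items() if len(text) > limit]

missions = {
    "Mission 1": "Sign up for a new account using your email address.",
    "Mission 2": ("You just landed on the home screen. Find the settings page, "
                  "open the notifications section, and turn off email alerts "
                  "for comments, likes, and new followers on your posts."),
}

print(overlong_missions(missions))  # Mission 2 is over the limit: split it up
```

Any mission the script flags is a candidate for trimming details or splitting in two.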
#4. Your missions should follow the app's UX flow
Your product was crafted to be used a certain way, so a good practice is to follow the app's natural UX flow and avoid jumping between unrelated parts of your product from one mission to the next.
To achieve this, have each mission start on the previous mission's end screen as much as possible: doing so will help your testers avoid confusion.
#5. Make sure your testers won't get stuck
This is an obvious one — yet very common: make sure your prototype doesn't have:
- Screens with no way to go back or reach other pages of the prototype (no hotspots)
- Infinite loops of "Go back" events: two consecutive screens that only link back to each other will trap testers bouncing between them endlessly.
If a tester gets stuck, they are 20% more likely to bounce than to simply give up on the mission and move on.
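You can sanity-check a prototype for both traps before launching. A minimal sketch, assuming you can write out each screen's outgoing hotspot targets as a simple mapping (a hypothetical format for illustration; Maze doesn't export this):

```python
# Check a prototype's screen graph for the two traps above.
# `links` maps each screen to the screens reachable via its hotspots.
# This mapping is a hypothetical, hand-written export, not a Maze API.
def dead_ends(links):
    """Screens with no outgoing hotspots: testers get stuck there."""
    return [screen for screen, targets in links.items() if not targets]

def go_back_loops(links):
    """Pairs of screens whose only hotspots point at each other."""
    loops = []
    for a, targets in links.items():
        if len(targets) == 1:
            b = targets[0]
            if links.get(b) == [a] and a < b:  # a < b avoids duplicate pairs
                loops.append((a, b))
    return loops

links = {
    "Home": ["Settings", "Profile"],
    "Settings": ["Notifications"],
    "Notifications": ["Settings"],   # only way out is back: a loop
    "Profile": [],                   # no hotspots: a dead end
}
print(dead_ends(links))      # → ['Profile']
print(go_back_loops(links))  # → [('Notifications', 'Settings')]
```

Anything either function reports needs a hotspot added (or a link redirected) before testers see it.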
#6. Avoid using your app's lingo
Unless the variable you are testing is your testers' ability to understand your product's internal language, use broad, general terms to describe actions.
✅ Do: "Post a new status update!"
❌ Don't: "Send a Wuphf!"
#7. Define each mission's success KPIs with your team
Since this feature isn't (yet 🤫) available in Maze, a great way to get the most out of the collected data is to create a sheet where you define what you expect, KPI-wise, for each mission.
| Mission | Direct success | Avg time | Misclick rate |
| --- | --- | --- | --- |
| Mission 1 | > 75% | < 12s | < 20% |
| Mission 2 | > 50% | < 5s | 0% |
| Mission 3 | > 90% | < 9s | < 10% |
After your testing session is complete, compare your expectations against the collected KPIs to see where your design falls short.
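That comparison is easy to automate once the results are in. A sketch where the thresholds mirror the table above and the "actual" figures are invented; in practice you'd copy them out of your Maze report by hand:

```python
# Compare expected KPI thresholds against collected results.
# Thresholds follow the example sheet; the actual figures are made up.
expected = {  # mission -> (min direct success %, max avg time s, max misclick %)
    "Mission 1": (75, 12, 20),
    "Mission 2": (50, 5, 0),
    "Mission 3": (90, 9, 10),
}
actual = {  # mission -> (direct success %, avg time s, misclick %)
    "Mission 1": (82, 10.4, 14),
    "Mission 2": (46, 6.1, 3),
    "Mission 3": (93, 8.2, 8),
}

def failing_missions(expected, actual):
    """Return the missions that miss at least one KPI threshold."""
    failed = []
    for mission, (min_success, max_time, max_misclick) in expected.items():
        success, avg_time, misclick = actual[mission]
        if success < min_success or avg_time > max_time or misclick > max_misclick:
            failed.append(mission)
    return failed

print(failing_missions(expected, actual))  # → ['Mission 2']
```

In this made-up run, Mission 2 misses all three thresholds, so that part of the design is where you'd dig in first.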