7 tips to craft the perfect maze
If you're new to Maze, welcome aboard!
Here are seven tips on creating a great experience for your testers while getting the most out of Maze.
#1. Keep the number of missions under 7
Remote testing sessions are very different from face-to-face interviews: testers aren't dedicating a hundred percent of their focus to your product. That's great news, because in real life your users won't either.
Users give up some of their free time to participate, so their attention will drop significantly the longer your maze lasts. A good rule of thumb is to keep the number of missions under seven, and the average completion time under five minutes.
Try sharing your draft maze with a co-worker and track the time it takes them to complete it; if it's over five minutes, you might need to simplify your maze.
#2. Start with a simple & straightforward mission
Chances are your testers are new to the concept of prototyping and user testing. To help them, we've added the following warnings at the beginning of a maze:
✔︎ This is not the final product, only a succession of interactive pages.
✔︎ You will never be asked to type or do a particular gesture, only clicks.
✔︎ If something doesn't respond on click, it's not clickable.
Even with these warnings in place, it's common to see patterns of user frustration during the first mission (misclicks, longer time spent on pages, bounces, etc.).
A great way to introduce the concept of prototyping (and Maze) to your testers is to start with a simple and straightforward mission: a 3-slide clickable walkthrough for your app has proven to work wonders!
Check out our article on how to introduce Maze to your testers.
#3. Keep your descriptions under 140 characters
You should think of your mission description as a tweet: it gives a general purpose without going into too much detail. If you find yourself writing more than 140 characters, you're either:
- Giving the tester too much interaction detail ("Go to page X and click on button Y"), which leads to biased results.
- Creating a mission that should be broken down into two separate ones.
#4. Your missions should follow the user flow
Your product has been crafted to be used a certain way, so a great practice is to follow the product's natural user flow, and avoid jumping between unrelated parts of your product from one mission to the next.
To achieve this, try as much as possible to start each mission on the previous mission's end screen: doing so will help avoid confusion.
#5. Make sure your testers won't get stuck
This is an obvious yet very important one: make sure your prototype doesn't have:
- Screens with no way to go back or access other pages of the prototype (no hotspots)
- Infinite loops of "Go back" events: two consecutive screens that both trigger "go back" will loop endlessly from one to the other.
If testers get stuck, they are 20% more likely to bounce (leave the maze entirely) instead of simply giving up the mission.
#6. Avoid using your product lingo
Unless the variable you're testing is whether testers can understand your product's internal language, use broad, general terms to describe actions.
✅ Do: "Post a new status update!"
❌ Don't: "Send a Wuphf!"
#7. Define mission success KPIs with your team
Since this feature isn't (yet 🤫) available in Maze, a great way to make the most of the collected data is to create a sheet where you define your KPI expectations for each mission:
| Mission | Direct success | Avg time | Misclick rate |
|---|---|---|---|
| Mission 1 | > 75% | < 12s | < 20% |
| Mission 2 | > 50% | < 5s | 0% |
| Mission 3 | > 90% | < 9s | < 10% |
After your testing session is complete, compare your expectations to the collected KPIs and see where your design can be improved.
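If your team prefers scripts to spreadsheets, the comparison above is easy to automate. Here's a minimal sketch, assuming you copy your targets and collected results into Python dictionaries by hand; all numbers and key names (`direct_success`, `avg_time_s`, `misclick_rate`) are hypothetical, not a Maze export format:

```python
# KPI targets per mission (hypothetical values, mirroring the table above).
targets = {
    "Mission 1": {"direct_success": 0.75, "avg_time_s": 12, "misclick_rate": 0.20},
    "Mission 2": {"direct_success": 0.50, "avg_time_s": 5,  "misclick_rate": 0.00},
    "Mission 3": {"direct_success": 0.90, "avg_time_s": 9,  "misclick_rate": 0.10},
}

# Collected results, entered manually after the testing session (made up here).
results = {
    "Mission 1": {"direct_success": 0.81, "avg_time_s": 10.4, "misclick_rate": 0.15},
    "Mission 2": {"direct_success": 0.46, "avg_time_s": 6.2,  "misclick_rate": 0.05},
    "Mission 3": {"direct_success": 0.93, "avg_time_s": 8.1,  "misclick_rate": 0.08},
}

def check(mission):
    """Return, per KPI, whether the collected result meets the target."""
    t, r = targets[mission], results[mission]
    return {
        "direct_success": r["direct_success"] >= t["direct_success"],
        "avg_time_s": r["avg_time_s"] <= t["avg_time_s"],
        "misclick_rate": r["misclick_rate"] <= t["misclick_rate"],
    }

for mission in targets:
    failed = [kpi for kpi, ok in check(mission).items() if not ok]
    status = "OK" if not failed else "review: " + ", ".join(failed)
    print(f"{mission}: {status}")
```

The script simply flags which missions missed which KPI, so a quick scan tells you where to focus your next design iteration.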