Maze is a rapid user testing platform from Maze.design in Paris, designed to deliver actionable user insights in a matter of hours. The vendor states that with it, users can test remotely, autonomously, and collaboratively.
$75 per month
Optimizely Feature Experimentation
Score 8.3 out of 10
N/A
Optimizely Feature Experimentation unites feature flagging, A/B testing, and built-in collaboration, so marketers can release, experiment, and optimize with confidence in one platform.
Maze User Testing is great if you're interested in doing user research from the comfort of your own desk. You can easily set up usability tests, surveys, card sorting, and tree tests, among other things, to get a better understanding of how customers use your product. The only limitation with Maze that I can identify at the moment is that it only supports unmoderated tests, so if you'd like to be able to ask follow-up questions in the moment, Maze is not the tool for you.
Based on my experience with Optimizely Feature Experimentation, I can highlight several scenarios where it excels and a few where it may be less suitable. Well-suited scenarios: multi-channel product launches, complex A/B testing and feature flag management, and gradual rollout with risk mitigation. Less suited scenarios: simple A/B tests (their Web Experimentation product is probably better for that) and non-technical team usage.
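The gradual rollouts mentioned above are typically implemented by hashing a stable user ID against a rollout percentage, so each user gets a sticky on/off decision and widening the rollout never flips anyone back off. A minimal, self-contained Python sketch of that pattern (this is an illustration of the general technique, not Optimizely's actual SDK; the flag name and percentages are made up):

```python
import hashlib

def is_enabled(flag_key: str, user_id: str, rollout_pct: float) -> bool:
    """Deterministically decide whether a flag is on for a user.

    The same (flag, user) pair always hashes to the same bucket,
    so a user's experience stays stable as the rollout widens.
    """
    digest = hashlib.sha256(f"{flag_key}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # value in [0, 1]
    return bucket < rollout_pct / 100

# Widening the rollout from 10% to 50% only adds users; nobody flips off.
users = [f"user-{i}" for i in range(1000)]
cohort_10 = {u for u in users if is_enabled("guided-setup", u, 10)}
cohort_50 = {u for u in users if is_enabled("guided-setup", u, 50)}
assert cohort_10 <= cohort_50
```

Because the decision is a pure function of the flag key and user ID, no server-side state is needed to keep assignments sticky, which is also what makes killing a misbehaving rollout (dropping the percentage back to 0) safe and instant.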
It is easy to use: any of our product owners, marketers, or developers can set up experiments and roll them out with some developer support. So the key thing is that the front-end UI is easy to use. Maybe this will come later, but we're also interested in new features such as Opal and the analytics or database-centric engine.
It would be nice to be able to switch variants quickly and effectively, say from an MVT to a 50:50 split if one of the variants is not performing well, while still being able to use the standardised report.
The interface can feel very bare-bones, without many of the graphs or visuals that other providers offer to make things a bit more engaging.
It doesn't easily show what each live variant looks like, so it can be hard to remember what is actually being shown in each test.
Maze is easy to use most of the time. It integrates easily with Figma, and it is easy to find testers worldwide using the required filters. Maze provides recorded videos, which are helpful for debugging and understanding problems with flows. A/B testing is easy to add and test. Overall, Maze is very easy to use.
The UI is easy to navigate. Once you know how to use it, it is very easy to run experiments. And when an experiment is set up, the SDK code variables are generated and available for developers to use immediately, so they can quickly build the experiment code.
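On the developer side, the generated experiment keys are typically consumed by hashing each user into weighted variant buckets and then branching on the assigned variant key. A minimal Python sketch of that assignment logic (the general bucketing technique, not Optimizely's actual SDK API; the experiment key, variant names, and weights here are illustrative):

```python
import hashlib

def assign_variant(experiment_key: str, user_id: str,
                   weights: dict[str, int]) -> str:
    """Map a user to a variant by hashing into weighted buckets.

    Weights are percentages and should sum to 100; insertion order
    of the dict defines the bucket layout.
    """
    digest = hashlib.sha256(f"{experiment_key}:{user_id}".encode()).hexdigest()
    point = int(digest[:8], 16) % 100  # position in [0, 100)
    cumulative = 0
    for variant, weight in weights.items():
        cumulative += weight
        if point < cumulative:
            return variant
    return next(iter(weights))  # fallback if weights sum to < 100

# A 50:50 split; experiment code then branches on the variant key.
variant = assign_variant("onboarding-test", "user-42",
                         {"control": 50, "guided_setup": 50})
if variant == "guided_setup":
    pass  # render the guided flow here
```

Reweighting the dict (e.g. shifting an MVT toward a 50:50 split) changes future assignments without any per-user state, which is why these SDKs can evaluate variants entirely client- or server-side.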
Lookback is an alternative option if you think Maze User Testing is too expensive for you, but Lookback takes a different approach: it focuses on qualitative usability testing instead of quantitative user testing. Also, Maze User Testing has a free option that Lookback doesn't, but Lookback's cheapest plan, at $19/month, is cheaper than Maze's.
When Google Optimize was shut down, we searched for a tool with a reliable GA4 implementation that would be easy to use for both the IT team and the product team. Optimizely Feature Experimentation seems to have a good balance between pricing and capabilities. If you are searching for experimentation and personalization all in one, then the comparison may change and Optimizely turns out expensive. The same goes if you want a server-side solution. For us, that will be a challenge in the coming years.
We have a noteworthy ROI case study from a SaaS onboarding revamp early this year. Our A/B test on a guided setup flow improved activation rates by 20 percent, which translated to over $1.2m in retained ARR.