Lookback vs. Optimizely Feature Experimentation vs. UserTesting

Overview
Product | Rating | Most Used By | Product Summary | Starting Price

Lookback
Score 7.7 out of 10
Most Used By: N/A
Lookback is a UX research platform for moderated and unmoderated research on mobile and desktop, from the company of the same name in Palo Alto.
Starting Price: N/A

Optimizely Feature Experimentation
Score 8.3 out of 10
Most Used By: N/A
Optimizely Feature Experimentation unites feature flagging, A/B testing, and built-in collaboration, so marketers can release, experiment, and optimize with confidence in one platform.
Starting Price: N/A

UserTesting
Score 8.4 out of 10
Most Used By: N/A
UserTesting helps UX researchers, designers, product teams, and marketers gather actionable insights through research, testing, and feedback. With a network of real people ready to share their perspectives, UserTesting enables organizations to make customer-first decisions at scale.
Starting Price: N/A
Pricing
Lookback | Optimizely Feature Experimentation | UserTesting

Editions & Modules: No answers on this topic (all three products)

Pricing Offerings
Free Trial: No | No | Yes
Free/Freemium Version: No | Yes | No
Premium Consulting/Integration Services: No | Yes | No
Entry-level Setup Fee: No setup fee | Required | No setup fee
Community Pulse
Lookback | Optimizely Feature Experimentation | UserTesting
Considered Multiple Products
Lookback

No answer on this topic

Optimizely Feature Experimentation
Chose Optimizely Feature Experimentation
Optimizely FX is the only tool I've used that specifically allows for testing in the back end. Most front-end tools are great for simple tests, but there comes a time when you need to go a level deeper, and that's not possible with front-end tools.
UserTesting
Chose UserTesting
UserTesting has a robust panel and makes it easy to set up tasks. Lookback was particularly helpful for setting up observation sessions and note-taking.
Chose UserTesting
I wasn't the person who selected UserTesting, but I did use it at a previous company, so I was aware of its capabilities. I really enjoy how UserTesting applies its research methods, and it has greater support. UserZoom was easy to handle, but I don't remember how it was …
Chose UserTesting
I used Lookback at a previous organization. Overall, UT is more usable for participants, and the built-in participant pool is a standout feature.
Chose UserTesting
We evaluated a range of research tools within the UX team, including UserZoom, Lookback, Maze, Optimal Workshop, and UserTesting, and, in the end, concluded that UserTesting had the most comprehensive offering in the market. The only issue we found was that UserTesting appeared …
Best Alternatives
Lookback | Optimizely Feature Experimentation | UserTesting

Small Businesses: Smartlook (Score 8.6 out of 10) | GitLab (Score 8.7 out of 10) | Smartlook (Score 8.6 out of 10)
Medium-sized Companies: Optimal (Score 9.1 out of 10) | GitLab (Score 8.7 out of 10) | Optimal (Score 9.1 out of 10)
Enterprises: Optimal (Score 9.1 out of 10) | GitLab (Score 8.7 out of 10) | Optimal (Score 9.1 out of 10)
User Ratings
Lookback | Optimizely Feature Experimentation | UserTesting

Likelihood to Recommend: 9.0 (2 ratings) | 8.3 (48 ratings) | 7.6 (189 ratings)
Likelihood to Renew: - (0 ratings) | 4.5 (2 ratings) | 7.5 (8 ratings)
Usability: 8.0 (1 rating) | 7.7 (27 ratings) | 8.0 (167 ratings)
Availability: - (0 ratings) | - (0 ratings) | 9.1 (1 rating)
Performance: - (0 ratings) | - (0 ratings) | 9.1 (1 rating)
Support Rating: - (0 ratings) | 3.6 (1 rating) | 6.5 (166 ratings)
Implementation Rating: - (0 ratings) | 10.0 (1 rating) | 4.6 (4 ratings)
Configurability: - (0 ratings) | - (0 ratings) | 7.3 (1 rating)
Product Scalability: - (0 ratings) | 5.0 (1 rating) | 7.3 (1 rating)
Vendor post-sale: - (0 ratings) | - (0 ratings) | 9.1 (1 rating)
Vendor pre-sale: - (0 ratings) | - (0 ratings) | 9.1 (1 rating)
User Testimonials
Lookback | Optimizely Feature Experimentation | UserTesting
Likelihood to Recommend
Lookback
Best suited to conducting remote interviews that are moderated and facilitated by the interviewer/researcher.
Not the best if you want to run unmoderated research; there are much more sophisticated tools out there. Unfortunately, for a design research team that does both kinds of research, it can be hard to get budget for two tools, and the unmoderated feature can seem super undercooked and doesn't really do the job.
Otherwise it's a great tool.
Read full review
Optimizely
Based on my experience with Optimizely Feature Experimentation, I can highlight several scenarios where it excels and a few where it may be less suitable.
Well-suited scenarios:
  • Multi-channel product launches
  • Complex A/B testing and feature flag management
  • Gradual rollout and risk mitigation
Less suited scenarios:
  • Simple A/B tests (their Web Experimentation product is probably better for that)
  • Non-technical team usage
Read full review
UserTesting
UserTesting has been great for moderated customer interviews/usability testing as well as for unmoderated testing of messaging, imagery, prototypes and live experiences. I would say that the scope of what you want needs to be limited, as the participants are only paid so much and tests are supposed to not exceed a certain amount of time. For customer interviews, I think it can be difficult to onboard customers to UserTesting if they have never used it before. If I set up interviews, I don't even have them use the UserTesting scheduling tool, I actually set up all the interviews with the customers myself through the tool (being mindful of time zones!). When we run the meeting, they really don't even know UserTesting is involved. Might be nice for UserTesting to allow the upload/connecting to of a Zoom interview and let it do the transcription/analysis from there.
Read full review
Pros
Lookback
  • Organization of user interviews
  • Sharing of interviews across the team
  • Creating highlights of insights
Read full review
Optimizely
  • It is easy to use; any of our product owners, marketers, or developers can set up experiments and roll them out with some developer support. So the key thing there is that the front-end UI is easy to use, and (maybe this will come later) new features such as Opal and the analytics/database-centric engine are something we're interested in as well.
Read full review
UserTesting
  • Product Manager who follows up on your UserTesting usage and gives advice/support when you need it.
  • UserTesting University is a great platform to learn how to use UT and general information about research.
  • UserTesting can find participants quickly, so you won't need to wait long before you can start the analysis.
Read full review
Cons
Lookback
  • Unmoderated interviews are still undercooked as a feature
  • The process of how participants have to download an app to start an interview is a large friction point for us
Read full review
Optimizely
  • Would be nice to be able to switch variants quickly and effectively, say from an MVT to a 50:50 split if one of the variants is not performing very well, so we can still use the standardised report
  • The interface can feel very bare-bones, without many graphs or visuals, which other providers include to make it a bit more engaging
  • Doesn't easily show what each live variant looks like, so it can be hard to remember what is actually being shown in each test
Read full review
UserTesting
  • Sometimes there are restrictions around types of research that can be used for moderated user-testing with our own users.
  • For tests on relatively small areas of a website or app, the AI analysis seems rather overblown, like it's trying too hard to come up with something insightful when the test is actually about something quite small (e.g. structure of a mobile app menu).
  • It's difficult to invite our own users to unmoderated user-testing because they wouldn't know how the UserTesting interface works - this is particularly an issue for mobile research.
Read full review
Likelihood to Renew
Lookback
No answers on this topic
Optimizely
Competitive landscape
Read full review
UserTesting
I'm very happy with my experience of the product and the level of service and learning resources they provide. If the service becomes more expensive than it currently is then we might not be able to justify additional cost - but this is theoretical. I would recommend UserTesting and would ideally renew our contract.
Read full review
Usability
Lookback
Once you understand how the interface works, it works great, but there is a learning curve.
Read full review
Optimizely
Easy to navigate the UI. Once you know how to use it, it is very easy to run experiments. And when the experiment is set up, the SDK code variables are generated and available for developers to use immediately, so they can quickly build the experiment code.
Read full review
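The reviewer's point about SDK-generated variables reflects the general feature-flag pattern: the application asks a client for a decision, then reads variant-specific variables. Here is a minimal self-contained sketch of that pattern in plain Python; the flag config, keys, and hash-based bucketing are hypothetical illustrations, not Optimizely's actual SDK or data format:

```python
import hashlib

# Hypothetical flag configuration -- illustrative only, not Optimizely's format.
FLAGS = {
    "new_checkout": {
        "variants": [
            {"key": "control",   "traffic": 0.5, "variables": {"button_text": "Buy now"}},
            {"key": "treatment", "traffic": 0.5, "variables": {"button_text": "Complete purchase"}},
        ]
    }
}

def bucket(user_id: str, flag_key: str) -> float:
    """Deterministically map (user, flag) to [0, 1) so a user always sees the same variant."""
    digest = hashlib.sha256(f"{flag_key}:{user_id}".encode()).hexdigest()
    return int(digest[:8], 16) / 0x100000000

def decide(user_id: str, flag_key: str) -> dict:
    """Return the variant (key plus its variables) this user falls into."""
    point, cumulative = bucket(user_id, flag_key), 0.0
    for variant in FLAGS[flag_key]["variants"]:
        cumulative += variant["traffic"]
        if point < cumulative:
            return variant
    return FLAGS[flag_key]["variants"][-1]  # guard against float rounding

decision = decide("user-42", "new_checkout")
print(decision["key"], decision["variables"]["button_text"])
```

Because the bucketing is a pure function of user ID and flag key, a returning user lands in the same variant on every request, which is what makes experiment results attributable.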
UserTesting
It's very good. I have used other tools in the past, and this is by far the most intuitive and user-friendly. Testament to this is how easily the non-researchers onboarded to the tool with our additional seat have picked it up.
Read full review
Reliability and Availability
Lookback
No answers on this topic
Optimizely
No answers on this topic
UserTesting
Never encountered any problems
Read full review
Performance
Lookback
No answers on this topic
Optimizely
No answers on this topic
UserTesting
Perfectly fine. Never had any problems.
Read full review
Support Rating
Lookback
No answers on this topic
Optimizely
Support was there, but it was pretty slow most of the time. Only after escalation was support really given to our teams.
Read full review
UserTesting
I have contacted UserTesting's customer service online, by email, or by phone a few times, and each time, I have encountered the same professionalism and expertise. Even in person during a work event, they were there, and it was the same experience.
Read full review
Implementation Rating
Lookback
No answers on this topic
Optimizely
It's straightforward. The docs are well written, and I believe support is available, but we haven't used it.
Read full review
UserTesting
From a technical perspective, the implementation was extremely smooth. Most of the change management / implementation hurdles were clearing use of the tool through our various security, legal, and information privacy teams. Once these concerns were addressed (UserTesting.com was very helpful in providing all the needed documentation), the implementation process was very simple and we were able to get going right away.
Read full review
Alternatives Considered
Lookback
Zoom was way more expensive, and it is designed to do other things apart from just running qualitative interviews. It also requires a different kind of approval, and different approval processes to go through, when trying to get it simply for qualitative research purposes.
Lookback records, transcribes, helps with observation, and provides a sentiment check as well, all within its price.
Read full review
Optimizely
When Google Optimize was discontinued, we searched for a tool with a solid GA4 implementation that is easy to use for both the IT team and the product team. Optimizely Feature Experimentation seems to have a good balance between pricing and capabilities. If you are searching for experimentation and personalization all in one tool, then the comparison may change and Optimizely turns out to be expensive. The same applies if you want a server-side solution. For us, it will be a challenge in the following years.
Read full review
UserTesting
The quality of the participants: they usually give good feedback and act like "professional" users, which is good when we want a few insights in a short amount of time. Also, the interface is good. I miss having more features, like the good transcription tool we have in Condens.
Read full review
Scalability
Lookback
No answers on this topic
Optimizely
We had trouble with performance for SSR and the React SDK.
Read full review
UserTesting
The package we have limits the number of people who can set up tests. This prevents us from scaling the use of the platform.
Read full review
Return on Investment
Lookback
  • It allows us to understand our customers' problems in a way the whole team can engage with.
Read full review
Optimizely
  • We have a huge, noteworthy ROI case study of how we did a SaaS onboarding revamp early this year. Our A/B test on a guided setup flow improved activation rates by 20 percent, which translated to over $1.2m in retained ARR.
Read full review
UserTesting
  • Content is key to the sign up journey and UserTesting helps me improve that.
  • UserTesting has helped me improve the Cashier on our site, making it easier for users to deposit money.
  • UserTesting is helping me iterate all our content decisions, helping us improve the UX across all platforms.
Read full review
ScreenShots

Optimizely Feature Experimentation Screenshots

Screenshot of Feature Flag Setup. Here users can run flexible A/B and multi-armed bandit tests, as well as:

- Set up a single feature flag to test multiple variations and experiment types
- Enable targeted deliveries and rollouts for more precise experimentation
- Roll back changes quickly when needed to ensure experiment accuracy and reduce risks
- Increase testing flexibility with control over experiment types and delivery methods

Screenshot of Audience Setup. This is used to target specific user segments for personalized experiments, and:

- Create and customize audiences based on user attributes
- Refine audience segments to ensure the right users are included in tests
- Enhance experiment relevance by setting specific conditions for user groups

Screenshot of Experiment Results, supporting the analysis and optimization of experimentation outcomes. Viewers can also:

- Examine detailed experiment results, including key metrics like conversion rates and statistical significance
- Compare variations side-by-side to identify winning treatments
- Use advanced filters to segment and drill down into specific audience or test data

Screenshot of a Program Overview. These offer insights into any experimentation program's performance, including:

- A comprehensive view of the entire experimentation program's status and progress
- Monitoring for key performance metrics like test velocity, success rates, and overall impact
- Evaluation of the impact of experiments with easy-to-read visualizations and reporting tools
- Performance tracking of experiments over time to guide decision-making and optimize strategies

Screenshot of AI Variable Suggestions. These enhance experimentation with AI-driven insights, and can also help with:

- Generating multiple content variations with AI to speed up experiment design
- Improving test quality with content suggestions
- Increasing experimentation velocity and achieving better outcomes with AI-powered optimization

Screenshot of Schedule Changes, to streamline experimentation. Users can also:

- Set specific times to toggle flags or rules on/off, ensuring precise control
- Schedule traffic allocation percentages for smooth experiment rollouts
- Increase test velocity and confidence by automating progressive changes
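The scheduled traffic-allocation idea can be sketched generically: a progressive rollout is a time-ordered list of (activation time, traffic fraction) steps, and the allocation in effect is the latest step whose time has passed. A minimal illustration in Python; the schedule format and function names are hypothetical, not Optimizely's API:

```python
from datetime import datetime, timezone

# Hypothetical rollout schedule -- illustrative only, not Optimizely's format.
# Each entry: (activation time, fraction of traffic that gets the new feature).
SCHEDULE = [
    (datetime(2024, 6, 1, tzinfo=timezone.utc), 0.05),
    (datetime(2024, 6, 3, tzinfo=timezone.utc), 0.25),
    (datetime(2024, 6, 7, tzinfo=timezone.utc), 1.00),
]

def traffic_allocation(now: datetime) -> float:
    """Return the traffic fraction in effect at `now` (0.0 before the first step)."""
    allocation = 0.0
    for starts_at, fraction in SCHEDULE:
        if now >= starts_at:
            allocation = fraction  # later steps override earlier ones
    return allocation
```

Automating the ramp this way means no one has to remember to bump the percentage by hand, which is the "test velocity and confidence" benefit the feature description refers to.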

UserTesting Screenshots

Screenshot of UserTesting's several solutions for gathering rich customer experience narratives.

Screenshot of Interactive Path Flows. Built on recent research in data mining, the Interactive Path Flow aggregates interaction data across multiple participant sessions to visualize the customer journey, surface unexpected behaviors, and locate key moments in the customer journey.

Screenshot of Keyword Mapping. Similar keywords are automatically grouped based on overall sentiment (positive, negative, or neutral) to identify themes. Highlight reels associated with each keyword are grouped together, to discover the why behind each sentiment.

Screenshot of Video capture and live streaming. Digital and real-world customer experiences are recorded on desktop and mobile devices or live streamed for in-platform viewing.

Screenshot of Audience targeting. Audiences are specified by screening contributors from UserTesting's global network of contributors or connecting to any preferred network.