Collect in-product feedback, measure customer satisfaction, and learn how you can improve. Get ratings, visitors' screen views, and sentiment feedback from customers now.
$98
per month
UserTesting
Score 8.1 out of 10
N/A
UserTesting helps UX researchers, designers, product teams, and marketers gather actionable insights through research, testing, and feedback. With a network of real people ready to share their perspectives, UserTesting enables organizations to make customer-first decisions at scale.
Definitely a great tool that is way better than just recording web QA feedback in a Google Doc or via email. Everything goes straight into a queue and it is easy to tag what you are talking about with the tool. Leaves less room for misinterpretation and keeps a record of all feedback so nothing gets missed. Filtering within the feedback queue and search functionality to avoid duplicates would be helpful.
Well suited to its original purpose: usability testing and interviews. These can be performed at pace, given the large audience (although our brands are very well known, so this should not be a barrier), and there is a decent level of task customisation when conducting unmoderated testing. It's less appropriate for surveys where you are looking to capture genuine intent/behaviour; even with screeners, the data skews more positively than an onsite survey, which makes me question the quality of survey respondents.
When you go into the list of Usersnap feedback you have submitted, there is no search or filtering functionality to view feedback of a certain type, or to check whether you have already submitted that feedback.
Quality of the participant pool - many are career testers, and many are untruthful. Since sessions are auto-scheduled once the screener is passed, you often don't know until they've completed the test. Allow double screening, or be more stringent in removing such users from the platform.
Unfinished products - focus on making one product the best it can be before moving on to a new one. Unmoderated testing is still missing features (randomization of three or more prototypes, etc.).
I'm very happy with my experience of the product and the level of service and learning resources they provide. If the service becomes more expensive than it currently is then we might not be able to justify additional cost - but this is theoretical. I would recommend UserTesting and would ideally renew our contract.
It can be difficult to organize our tests and go back and find information. I think the AI tools are helping and will help with this, but for now it is time consuming to sort through all of the tests and information and then synthesize it and share it with others. It just takes a lot of time.
I am unsure how to rate Usersnap's support, as I have not contacted support yet. The tool works well as is. The agency we work with that used the tool didn't need to contact Usersnap support either. I'm sure the user support for the tool is adequate.
I have contacted UserTesting's customer service online, by email, or by phone a few times, and each time, I have encountered the same professionalism and expertise. Even in person during a work event, they were there, and it was the same experience.
From a technical perspective, the implementation was extremely smooth. Most of the change management / implementation hurdles were clearing use of the tool through our various security, legal, and information privacy teams. Once these concerns were addressed (UserTesting.com was very helpful in providing all the needed documentation), the implementation process was very simple and we were able to get going right away.
Prior to Usersnap we looked at and even tried to bring up Bugzilla, but it requires a lot of maintenance and customization in my opinion. We needed something that was ready to use out of the box, which Usersnap certainly was. The other problem with Bugzilla is that it's mostly for software development bugs, that is, bugs submitted by developers, not really end users. Yes, it can be used by end-users, but not as intuitively as Usersnap.
The quality of the participants: they usually have good feedback and act like "professional" users, which is good when we want a few insights in a short amount of time. The interface is good as well. I miss having more features, like a good transcription tool such as the one we have in Condens.