Lyssna (formerly UsabilityHub) is a user research platform used to test digital products with real users and gain insight into their audience. Its tools and features help users optimize their designs and create more engaging, user-friendly experiences. Lyssna offers a broad range of testing features, including Five Second Testing, used to quickly test the effectiveness of landing pages, messaging, and designs by showing users a…
$0 per month per seat
UserTesting
Score 8.1 out of 10
UserTesting helps UX researchers, designers, product teams, and marketers gather actionable insights through research, testing, and feedback. With a network of real people ready to share their perspectives, UserTesting enables organizations to make customer-first decisions at scale.
Pricing
Editions & Modules

Lyssna
  Free: $0 (3 seats included)
  Starter: $99 per month (5 seats included)
  Growth: $199 per month (15 seats included)
  Enterprise: Contact Sales (custom seats)

UserTesting
  No answers on this topic
Pricing Offerings (Lyssna / UserTesting)
  Free Trial: Yes / Yes
  Free/Freemium Version: Yes / No
  Premium Consulting/Integration Services: No / No
  Entry-level Setup Fee: No setup fee / No setup fee
  Additional Details: Discount available for annual plans; panel responses are priced separately.
Lyssna is certainly the least expensive, most basic, and easiest to use of the usability tools I have used in the past. Depending on your maturity as a business and the projects you are doing, it can be a great starting point before scaling up.
We evaluated two other platforms, UserZoom and UsabilityHub. We ultimately decided to maintain our relationship with UserTesting due to the overall usability and functionality it offers. The features better suited our needs, and it met a price point that worked …
UserTesting is more robust. We also use UsabilityHub, but for different purposes: one-off tests that don't require many screens but do require more responses.
UsabilityHub is well suited for remote unmoderated testing. Responses are captured very quickly, and live updates let you keep track of how the test is performing. The types of testing that make the most sense on UsabilityHub are preference tests, first-click tests, navigation tests, and design surveys. It is less appropriate for one-on-one testing and lengthy questionnaires.
Well suited to its original purpose: usability testing and interviews. These can be performed at pace, given the large audience (although our brands are very well known, so this should not be a barrier), and there is a decent level of task customisation when conducting unmoderated testing. It's less appropriate for surveys where you are looking to capture genuine intent/behaviour; even with screeners, the data skews more positively than onsite surveys, which makes me question the quality of survey respondents.
Add more demographic sorting options for the audience to better meet the needs of B2B users, for example industry type, functional area, etc.
Quality of participant pool: many are career testers, and many are untruthful. Since sessions are auto-scheduled once the screener is passed, you often don't find out until they've completed the test. Allow double screening or be more stringent in removing users from the platform.
Unfinished products: focus on making one product the best it can be before moving on to a new one. Unmoderated testing is still missing features (randomization of 3 or more prototypes, etc.).
I'm very happy with my experience of the product and the level of service and learning resources they provide. If the service becomes more expensive than it currently is, we might not be able to justify the additional cost, but this is theoretical. I would recommend UserTesting and would ideally renew our contract.
Due to its simplicity and design, it is really easy to navigate. You can clearly see which sections you have completed and which are still left to do. It is also really easy to change the ordering of content, which I have found isn't an option in other tools, where getting tasks or questions into the desired order means rewriting all of them, a really lengthy task.
It can be difficult to organize our tests and go back and find information. I think the AI tools are helping and will help with this, but for now it is time-consuming to sort through all of the tests and information and then synthesize it and share it with others. It just takes a lot of time.
I have contacted UserTesting's customer service online, by email, or by phone a few times, and each time, I have encountered the same professionalism and expertise. Even in person during a work event, they were there, and it was the same experience.
From a technical perspective, the implementation was extremely smooth. Most of the change management / implementation hurdles were clearing use of the tool through our various security, legal, and information privacy teams. Once these concerns were addressed (UserTesting.com was very helpful in providing all the needed documentation), the implementation process was very simple and we were able to get going right away.
UsabilityHub provides very fast, short responses to specific questions about a static image of a website. This is useful for checking what is most prominent on a page, what users would click on, what they see/read within the first 5 seconds of landing, etc. WhatUsersDo is a broader tool that records the screen and audio as a user navigates the website. You can set tasks and ask questions, but it is much more about the user journey experience and their opinions, rather than testing a particular feature. Feedback also takes a bit longer. Hotjar is a combination of both: it's a screen recording that helps you see where users click and move, but there is no audio or text feedback, just heatmaps/click maps for watching user behaviour.
The quality of the participants: they usually have good feedback and act like "professional" users, which is good when we want a few insights in a short amount of time. The interface is also good. I miss having more features, like a good transcription tool such as the one we have in Condens.