Lyssna (formerly UsabilityHub) is a user research platform used to test digital products with real users and gain insights into their audience. Its tools and features help users optimize their designs and create more engaging, user-friendly experiences. Lyssna offers a broad range of testing features, including: Five Second Testing - used to quickly test the effectiveness of landing pages, messaging, and designs by showing users a…
Pricing

Editions & Modules

Crazy Egg:
  Crazy Egg: $24.00 per month

Lyssna:
  Free: $0 (3 seats included)
  Starter: $99 per month (5 seats included)
  Growth: $199 per month (15 seats included)
  Enterprise: Contact Sales (custom seats)
Offerings

Pricing Offerings                          Crazy Egg      Lyssna
Free Trial                                 No             Yes
Free/Freemium Version                      No             Yes
Premium Consulting/Integration Services    No             No
Entry-level Setup Fee                      No setup fee   No setup fee

Additional Details
  Crazy Egg: none
  Lyssna: Discount available for annual plans. Panel responses are priced separately.
Maze has more comprehensive reporting than UsabilityHub. Maze's interface is clean and modern, but it lacks the simple, intuitive testing setup found in UsabilityHub. Maze's terminology is slightly confusing, and flow tests integrate only with Figma, XD, …
+ I strongly believe that this tool helps when a firm has a good user count (depending on the business model), as most of these tools are data driven: more data, more valuable insights.
+ Best fit for someone who is looking for deeper insights into individual pages.
- Not suggested for websites with very few visits; in that case, focus first on improving the visit count.
UsabilityHub is well suited for remote unmoderated testing. Responses are captured very quickly, and live updates allow the user to keep track of how the test is performing. The types of testing that make the most sense on UsabilityHub are preference tests, first-click tests, navigation tests, and design surveys. It is less appropriate for one-on-one testing and lengthy questionnaires.
Provides heatmaps that show you which elements on your site are and aren't performing well.
Provides scrollmaps so you can see how far down a page users are scrolling and which content never gets seen.
Screenshots show you how your website looks across a variety of different devices.
Provides a type of clickmap called confetti that enables you to visualise clicks by segment: device, new/returning visitors, campaigns, and other metrics.
The biggest thing we've struggled with is the Optimizely integration. I've contacted customer service a few times to get it properly set up. Customer service is always friendly and helpful; they provide clear steps to get it set up. Unfortunately, despite the clear instructions, the steps are tedious, and if they are not completed in the correct order, the integration with Optimizely does not work. My success rate with the integration is less than 55%.
Add additional demographic sorting options for the audience to better meet the needs of B2B users - for example, industry type, functional area, etc.
It's a great tool considering how inexpensive it is. If used correctly, and you have a plan for tracking your websites, this tool can make a world of difference. If you are not going to sit down and take the time to make a plan for how to use this tool, I would say it is not worth your time. Yes, you can look at items on your website that need to be changed, but without a consistent plan, other important items that need changing can get lost in the mix. Make sure you have enough time and energy to invest in this and it will be well worth it.
Crazy Egg is extremely easy to set up and use, and very well done from a user experience standpoint. It is really helpful that I can give stakeholders access to the interface and get them interacting with it with minimal training. The A/B testing is the easiest I have ever used, with minimal performance impact to the website.
Due to its simplicity and design, it is really easy to navigate. You can clearly see which sections you have completed and which are still left to be done. It is also really easy to change the ordering of content, etc., which I have found isn't an option in other tools; there, reordering becomes a really lengthy task of rewriting all of the tasks or questions to get them in the desired order.
It's slow to post data, and slow to get a snapshot to finally become active (i.e., not pending). Not intolerable, but it would be nice to see data within a couple of hours; we often have to wait until the next day.
I think support is an area where Crazy Egg is lacking. I would love to have a quarterly check-in with a Crazy Egg rep to understand what kinds of changes have been made to the platform and what is on the horizon. I also think a quick consulting session with a rep could be extremely beneficial, as I'm sure there are ways to use the tool that we haven't even thought of yet that would be extremely insightful for our team.
I will say that I didn't evaluate or select Crazy Egg; it's a legacy tool that was at the company before me. Honestly, we're not even sure of all of the features and functionality we can use. As a UXR, I think there are other tools that would help me more in gaining visibility into what our users are doing on our website, and I've evaluated other tools that are more aligned with UXR. However, if we properly paired it with experimentation, it might be a more valuable tool for us.
UsabilityHub provides very fast, short responses to specific questions about a static image of a website. This is useful for checking what is most prominent on a page, what users would click on, what they see or read within the first 5 seconds of landing, etc. WhatUsersDo is a broader tool that records the screen and audio as a user navigates the website. You can set tasks and ask questions, but it is much more about the user journey experience and their opinion, rather than testing a particular feature; feedback also takes a bit longer. Hotjar is a combination of both: it's a screen recording that helps you see where users click and move, but there is no audio or text feedback, just heatmaps/click maps for watching user behaviour.
Its reliability (not scalability, as the question asks for, sorry) is pretty good, but through our testing we know that some clicks do not get recorded. It doesn't bother us much because we look at the aggregate of thousands of visits, but we do know it misses things. As for scalability, it's about right. You really don't want zillions of clicks per snapshot: the screen just turns to 100% dots and you lose the ability to differentiate different screen areas. We find that 25,000 clicks for a page gives us a really good view.