UserTesting helps UX researchers, designers, product teams, and marketers gather actionable insights through research, testing, and feedback. With a network of real people ready to share their perspectives, UserTesting enables organizations to make customer-first decisions at scale.
The real-time data from FullStory is great and we wish UT had a similar feature, but the ease of setting up individual tests and qualitative feedback is more important for us at this stage. We chose UT over FS when our budget forced us to drop one.
+ I strongly believe that this tool helps when a firm has a good user count (depending on business model), as most of these tools are data-driven. More data means more valuable insights.
+ Best fit for someone looking for deeper insights into an individual page. Not suggested for websites with very few visits; work on improving your visit count first.
Well suited to its original purpose: usability testing and interviews. These can be performed at pace, given the large audience (although our brands are very well known, so this should not be a barrier), and there is a decent level of task customisation when conducting unmoderated testing. It's less appropriate for surveys where you are looking to capture genuine intent/behaviour; even with screeners, the data skews more positively than onsite surveys, which makes me question the quality of survey respondents.
Provides heatmaps that show you which elements on your site are and aren't performing well.
Provides scrollmaps so you can see how far down a page users are scrolling and which content never gets seen.
Screenshots show you how your website looks across a variety of different devices.
Provides a type of clickmap called confetti that enables you to visualise clicks by segment - device, new/returning visitors, campaigns, and other metrics.
The biggest thing we've struggled with is the Optimizely integration. I've contacted customer service a few times to get it properly set up. Customer service is always friendly and helpful; they provide clear steps to get it set up. Unfortunately, despite the clear instructions, the steps are tedious, and if they're not completed in the correct order, the integration with Optimizely does not work. My success rate with the integration is less than 55%.
Quality of the participant pool - many are career testers, and many are untruthful. Since sessions are auto-scheduled once the screener is passed, you often don't know until they've completed the test. Allow double screening, or be more stringent about removing such users from the platform.
Unfinished products - focus on making one product the best it can be before moving on to a new one. Unmoderated testing is still missing features (randomization of 3 or more prototypes, etc.)
It's a great tool considering how inexpensive it is. If used correctly, and you have a plan for tracking your websites, this tool can make a world of difference. If you are not going to sit down and take the time to make a plan for how to use this tool, I would say it is not worth your time. Yes, you can look at items on your website that need to be changed, but without a consistent plan, other important items that need changing can be lost in the mix. Make sure you have enough time and energy to invest in this, and it will be well worth it.
I'm very happy with my experience of the product and the level of service and learning resources they provide. If the service becomes more expensive than it currently is then we might not be able to justify additional cost - but this is theoretical. I would recommend UserTesting and would ideally renew our contract.
Crazy Egg is extremely easy to set up and use, and very well done from a user experience standpoint. It is really helpful that I can give stakeholders access to the interface and get them interacting with it with minimal training. The A/B testing is the easiest I have ever used, with minimal performance impact to the website.
It can be difficult to organize our tests and go back and find information. I think the AI tools are helping and will help with this, but for now it is time consuming to sort through all of the tests and information and then synthesize it and share it with others. It just takes a lot of time.
It's slow to post data, and slow to get a snapshot to finally become active (i.e. not pending). Not intolerable, but it would be nice to see data within a couple of hours. Often we have to wait until the next day.
I think support is an area where Crazy Egg is lacking. I would love to have a quarterly check-in with a Crazy Egg rep to understand what kinds of changes have been made to the platform and what is on the horizon. I also think quick consulting sessions with a rep could be extremely beneficial, as I'm sure there are ways to use the tool that we haven't even thought about yet that would be extremely insightful for our team.
I have contacted UserTesting's customer service online, by email, or by phone a few times, and each time, I have encountered the same professionalism and expertise. Even in person during a work event, they were there, and it was the same experience.
From a technical perspective, the implementation was extremely smooth. Most of the change management / implementation hurdles were clearing use of the tool through our various security, legal, and information privacy teams. Once these concerns were addressed (UserTesting.com was very helpful in providing all the needed documentation), the implementation process was very simple and we were able to get going right away.
I will say that I didn't evaluate or select Crazy Egg; it's a legacy tool that was at the company before me. Honestly, we're not even sure of all of the features/functionality that we can use. As a UXR, I think there are other tools that would help me more in gaining visibility into what our users are doing on our website, and I've evaluated other tools that are more aligned with UXR. However, if we properly paired it with experimentation, it might be a more valuable tool for us.
The quality of the participants: they usually have good feedback and act like "professional" users, which is good when we want a few insights in a short amount of time. Also, the interface is good. I miss having more features, like the good transcription tool we have in Condens.
Its reliability (not scalability, as the question asks for, sorry) is pretty good, but through our testing we know that some clicks do not get recorded. It doesn't bother us a lot because we look at the aggregate of thousands of visits, but we do know it misses things. As for scalability, it's about right. You really don't want zillions of clicks per snapshot - the screen just turns to 100% dots and you lose the ability to differentiate different screen areas. We find that 25,000 clicks for a page gives us a really good view.