DX Application Performance Management (formerly CA APM, or CA Application Performance Management) is an application performance management platform designed to correlate and analyze data in real time. DX APM supports hybrid environments and customizable failure thresholds.
Lyssna
Score 7.7 out of 10
Lyssna (formerly UsabilityHub) is a user research platform used to test digital products with real users and gain insights into their audience. Its tools and features help users optimize their designs and create more engaging, user-friendly experiences. Lyssna offers a broad range of testing features, including Five Second Testing, used to quickly test the effectiveness of landing pages, messaging, and designs by showing users a…
$0
per month (3 seats included)
UserTesting
Score 8.2 out of 10
UserTesting helps UX researchers, designers, product teams, and marketers gather actionable insights through research, testing, and feedback. With a network of real people ready to share their perspectives, UserTesting enables organizations to make customer-first decisions at scale.
Pricing
DX Application Performance Management
Lyssna
UserTesting
Editions & Modules
DX Application Performance Management: No answers on this topic
Lyssna:
Free: $0 (3 seats included)
Starter: $99 per month (5 seats included)
Growth: $199 per month (15 seats included)
Enterprise: Contact Sales (custom seats)
UserTesting: No answers on this topic
Offerings
Pricing Offerings (DX Application Performance Management / Lyssna / UserTesting)
Free Trial: No / Yes / Yes
Free/Freemium Version: No / Yes / No
Premium Consulting/Integration Services: No / No / No
Entry-level Setup Fee: No setup fee / No setup fee / No setup fee
Additional Details
—
Discount available for annual plan. Panel responses are priced separately.
Lyssna is certainly the least expensive, most basic, and easiest to use of the usability tools I have used in the past. Depending on your maturity as a business and the projects you are doing, it can be a great starting point before scaling up.
We have evaluated two other platforms - UserZoom and UsabilityHub. We ultimately decided to maintain our relationship with UserTesting due to the overall usability and the functionality that it offers. The features better suited our needs, and it met a price point that worked …
UserTesting is more robust. We also use UsabilityHub, but for different purposes: one-off tests that don't require many screens but do require more responses.
CA Wily is well suited for monitoring load-balanced application nodes and determining their main performance bottlenecks, such as whether a problem exists in the heap or in connection issues. But I think when it comes to providing a full report of the application, it is less appropriate, as there is no way to create a complete test report covering all the monitored counters. There is also no way to detect certain issues based on the system configuration.
UsabilityHub is well suited for remote unmoderated testing. Responses are captured very quickly, and live updates allow the user to keep track of how the test is performing. The types of testing that make the most sense on UsabilityHub are preference tests, first click tests, navigation tests, and design surveys. It is less appropriate for one-on-one testing and lengthy questionnaires.
UserTesting has been great for moderated customer interviews/usability testing, as well as for unmoderated testing of messaging, imagery, prototypes, and live experiences. I would say that the scope of what you want needs to be limited, as the participants are only paid so much and tests are not supposed to exceed a certain amount of time. For customer interviews, I think it can be difficult to onboard customers to UserTesting if they have never used it before. If I set up interviews, I don't even have them use the UserTesting scheduling tool; I actually set up all the interviews with the customers myself through the tool (being mindful of time zones!). When we run the meeting, they really don't even know UserTesting is involved. It might be nice for UserTesting to allow uploading or connecting a Zoom interview and let it do the transcription/analysis from there.
Visualization of the metrics and data presented to provide a unique and attractive interface.
Flexibility to manage actions and commands that we deem important for each application. The ability to define these and customize the metrics reported for each individual application is huge for us.
We are facing challenges meeting RFP requirements in a few areas where I would recommend CA improve.
We face challenges with non-Java and non-.NET applications, which CA APM does not support. SAP is another area where CA APM is not able to address our business requirements.
Add additional demographic sorting options for the audience to better meet the needs of B2B users, for example industry type, functional area, etc.
Sometimes there are restrictions around types of research that can be used for moderated user-testing with our own users.
For tests on relatively small areas of a website or app, the AI analysis seems rather overblown, like it's trying too hard to come up with something insightful when the test is actually about something quite small (e.g. structure of a mobile app menu).
It's difficult to invite our own users to unmoderated user-testing because they wouldn't know how the UserTesting interface works - this is particularly an issue for mobile research.
I'm very happy with my experience of the product and the level of service and learning resources they provide. If the service becomes more expensive than it currently is then we might not be able to justify additional cost - but this is theoretical. I would recommend UserTesting and would ideally renew our contract.
Due to its simplicity and design, it is really easy to navigate. You can clearly understand which sections you have completed and which are still left to be done. It is also really easy to change the ordering of content, which hasn't been an option in other tools I have used; there, reordering means the lengthy task of rewriting all of the tasks or questions to get them into the desired order.
It's very good. I have used other tools in the past, and this is by far the most intuitive and user friendly. Testament to this is how easily the non-researchers we onboarded to the tool with our additional seat have taken to it.
I have contacted UserTesting's customer service online, by email, or by phone a few times, and each time, I have encountered the same professionalism and expertise. Even in person during a work event, they were there, and it was the same experience.
From a technical perspective, the implementation was extremely smooth. Most of the change management / implementation hurdles were clearing use of the tool through our various security, legal, and information privacy teams. Once these concerns were addressed (UserTesting.com was very helpful in providing all the needed documentation), the implementation process was very simple and we were able to get going right away.
We only have one application using DX APM, while most of our other applications are instrumented using Dynatrace. Dynatrace provides better monitoring and instrumentation for applications. The effort required to set up monitoring with DX APM is considerably more than with Dynatrace, and Dynatrace also offers more KPIs than DX APM.
UsabilityHub provides very fast, short responses to specific questions about a static image of a website. This is useful for checking what is most prominent on a page, what users would click on, what they see/read within the first 5 seconds of landing, etc. WhatUsersDo is a broader tool that records the screen and audio as a user navigates the website. You can set tasks and ask questions, but it is much more about the user journey experience and their opinion, rather than testing a particular feature. Feedback also takes a bit longer. Hotjar is a combination of both: it's a screen recording that helps you see where users click and move to, but there is no audio or text feedback, just heatmaps/click maps for watching user behaviour.
The quality of the participants: they usually give good feedback and act like "professional" users, which is good when we want a few insights in a short amount of time. The interface is also good. I miss having more features, like the good transcription tool we have in Condens.
Introscope is deeply utilized within the organization; however, CEM and Team Center are not used as much. Those that use one piece don't generally use the others, partly because of the learning curve of using the consoles effectively.
Reporting is pretty well configured and easy to set up if you know how to use the tools, so it can be easy to use and takes less time to configure for the different groups within the organization.