TrustRadius for Buyers is an essential component in my team's evaluation efforts
July 20, 2021

Thomas Young | TrustRadius Reviewer
Score 10 out of 10
Vetted Review
Verified User

Overall Satisfaction with TrustRadius for Buyers

The project dealt with choosing software to do advanced professional forecasting. Many products were reviewed, including open-source and proprietary tools such as SAS, STATA, Spotfire, R, Python, Forecast Pro, and others. My role was to lead the discussion and make final decisions on the chosen software. Others included in the discussion were colleagues and potential users. The ultimate decision was to use SAS Forecast Studio.
  • Easy to use
  • The reviews seem to be genuine
  • The reviews present pros and cons and discuss options. Really helpful.
  • I think it would be even more helpful to have videos of people actually using the software.
  • On some things, it didn't seem like the reviewer was as knowledgeable about the software as I would have expected.
  • Some reviews are too glowing. I like to see downside as well.
Ah, yes, some vendors will actually post pricing or provide pricing information in a timely manner when requested. Others are less than transparent about their software pricing, requiring me and the team to ask for pricing options more than once to get a real idea of what the software/project will cost. The more transparent firms tend to be the ones I like best.
I have used four of the software tools mentioned - G2, Capterra, Gartner Peer Insights, and SoftwareAdvice. I find written information from TrustRadius to be the best, but lacking in video offering. I often use more than one review site, and G2 comes in on top when I look for actual video footage of the reviewer. I trust the written reviews from TrustRadius more, though.
I use Gartner Magic Quadrant and Forrester Wave reports to gain insight into how software tools compare to each other at a broad level. They offer a relative performance and quality perspective and, as such, were very helpful. As for what I don't like about them: I don't always trust their rankings, and the information provided behind the graphics is often lacking, requiring further research on sites like TrustRadius.
In my experience, product ratings can have a significant influence on the final product chosen. Typically, one of the software tools near the top of the rankings is chosen, although interestingly, often not the top tool. More often than not, the product rankings are used as a sifting tool to limit the number of software programs that I or the team review, much the way job postings limit the number of resumes that have to be reviewed by requiring a master's degree.
I think the two biggest challenges when evaluating software are, first, individuals with pre-chosen tools and, second, individuals who loaf off of the group. On the first challenge, the issue typically remains unresolved throughout the evaluation, but one way to deal with it is to require individuals with pre-chosen tools to actually evaluate alternative tools themselves. On the second, the best way to deal with it is to require these individuals to come to the meetings with suggestions.
Although TrustRadius is imperfect, using it was very helpful and continues to be helpful in making important team decisions. I think having TrustRadius at my disposal before talking with colleagues makes me more informed than others in the room. Because of the information advantage I gain from using TrustRadius, I would recommend it to most anyone researching software products.