Why We’ve Rejected 69,968 Reviews and 165,744 Ratings (and Counting)

Megan Headley
January 30, 2023

*This article was updated on Jan. 20, 2023 based on new data from the TrustRadius team*

Let’s start by explaining why the word “trust” is in our name.

Trust is the most important factor to buyers who are reading and using reviews to help them decide what to buy. When TrustRadius was founded, trustworthiness was proving to be a challenge for consumer reviews, and the team knew trust would be especially important for professionals making high-dollar, mission-critical software purchase decisions for their organizations.

Years later, the way in which B2B software is bought and sold continues to evolve, with many studies showing that B2B buyers are more empowered to find information on their own and that declining trust in vendor sales reps is eroding those reps' influence. B2B software reviews are one agent of change, providing buyers with access to end users beyond their networks. The trust factor has indeed proven to be critical.

But simply having the word in your name doesn’t make it so. Here’s how our Research team makes sure we’ve got the most useful, high-quality, trustworthy content for business technology buyers.

Delivering what buyers trust

First, this is what buyers have told us they want in a review:

  1. To know they’re hearing from a real person who has used the software and has no conflict of interest
  2. To know that the person is experienced and knowledgeable about the product and maybe even other products in the space for comparison
  3. To understand the person’s use case to gauge how relevant the review is to them
  4. To know the reviewer is not being paid for a positive perspective, and that a collection of reviews is balanced and doesn’t come from only the happiest (or unhappiest) of customers

From day 1, TrustRadius has vetted each review prior to publication with the above in mind, knowing that if our reviews and our data aren’t trustworthy, we’re not helping buyers. We have actually rejected 47% of the reviews and ratings submitted to our site, for various reasons outlined below. These data points never saw the light of day, and never influenced buyers.

Indeed, we have seen individuals whose reviews we have rejected (because they had a fake LinkedIn profile or worked for the vendor, for example) go on to have their reviews published on other sites.

Building and protecting a trusted resource

To provide the best experience possible for buyers, here are some of the issues we watch out for, how often we reject them, and how we address them:

Conflicts of interest (0.5%)

We reject reviews from employees of the vendor or a competitor. This is pretty obvious! We do allow reviews from resellers as long as they are balanced and constructive, since often resellers have great perspectives on how the product stacks up to others in the space. But their ratings don't factor into the overall scores—we just keep their qualitative feedback and mark it as a reseller review, so that buyers can take their (often useful) feedback with a grain of salt.

Lack of experience (0.7%)

We want the reviews on TrustRadius to be in-depth and helpful for buyers who are comparing different products for different use cases. If there’s no indication in a review that the user understands the product and how to use it, buyers can’t be expected to trust that review. 

Suspicious user (90.1%)

Every reviewer must authenticate through LinkedIn or a validated work email. This is table stakes and true of most review sites. The TrustRadius Research team also vets the profile to make sure the photo is real, the individual has connections and a real work history, etc. We do this to ensure that no one creates a fake profile (i.e. the vendor, the competitor, a freelancer for hire, or someone out to get thank-you gift cards for writing multiple reviews of the same product).
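
The vetting itself is done by people, not scripts, but to make the kinds of checks concrete, here's a minimal sketch of how an automated pre-screen might surface red flags for a research team to investigate. Every field name and threshold below is invented for illustration; this is not a description of our actual tooling.

```python
from dataclasses import dataclass

@dataclass
class ReviewerProfile:
    # Hypothetical fields a pre-screen might pull from an authenticated profile
    has_photo: bool
    connection_count: int
    years_of_work_history: float
    works_for_vendor_or_competitor: bool
    reviews_of_same_product: int

def prescreen(profile: ReviewerProfile) -> list:
    """Collect red flags for a human vetter; thresholds are illustrative only."""
    flags = []
    if not profile.has_photo:
        flags.append("no profile photo")
    if profile.connection_count < 10:
        flags.append("very few connections")
    if profile.years_of_work_history < 1:
        flags.append("thin work history")
    if profile.works_for_vendor_or_competitor:
        flags.append("conflict of interest")
    if profile.reviews_of_same_product > 1:
        flags.append("multiple reviews of the same product")
    return flags

# A profile with no photo, 3 connections, and two reviews of one product
# would be flagged for closer manual review.
print(prescreen(ReviewerProfile(False, 3, 0.5, False, 2)))
```

Either way, a human makes the final call; the point of a pre-screen like this is only to prioritize which profiles get the closest look.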

Poor quality (2.9%)

We reject reviews that don't offer detailed insights. Most of the reviews we have rejected are not nefarious—they're not from fake identities or vendors or competitors. They're from real users who just didn't take the time to write a detailed review. If you can't share real details about how you and your organization are using the product, then your review is not going to be very useful to buyers. Reviews on TrustRadius are usually over 400 words, compared to 50-100 for most other sites. Many reviewers we reject for lack of detail actually come back and write a great review after we tell them what was unclear.

Plagiarism (3.4%)

Occasionally review writers will copy material from elsewhere—often the vendor’s own marketing materials, or discussions in a community forum for the product. No thanks! This is not useful to buyers. They want to hear from real users about their real experiences.
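
We won't detail our detection tooling here, but a common first pass for catching copied text is to compare a submission against known sources (such as the vendor's marketing pages) using word-shingle overlap. The snippet below is a minimal sketch of that general technique, not our actual system; the sample strings are made up.

```python
import re

def shingles(text: str, k: int = 5) -> set:
    """Break text into overlapping k-word sequences ("shingles")."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(a: str, b: str, k: int = 5) -> float:
    """Jaccard overlap between two texts' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

marketing_copy = "Acme Suite delivers best-in-class workflow automation for teams of any size."
submitted_review = "Acme Suite delivers best-in-class workflow automation for teams of any size, and it works."

# A high score against known marketing copy flags the review for a closer look.
print(f"{jaccard_similarity(marketing_copy, submitted_review):.2f}")
```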

Other (3.4%)

This is a broad category that can contain any of the following:

Personal vendettas

Occasionally (but pretty rarely) we get a review that reads more like a personal rant than a review of the product. Buyers don't trust this kind of content or find it useful. At the very least, buyers will want to know why the reviewer got involved with the vendor, i.e. what seemed attractive about the product in the first place, as well as how expectations weren't met. So in this case, we encourage the reviewer to provide detailed feedback about the product in addition to their negative experience with the vendor before we publish the review.

Screenshots

Some review sites ask reviewers for screenshots of their use of the product. This is actually something we don’t do. Screenshots are easily doctored, found online, or may contain sensitive business information. We find that a better indication of someone being a real user is the level of unique detail they offer in the review.

Vendor cherry-picking

Vendors love to invite their happiest customers to review their product. And there is nothing wrong with reviews from happy customers—they usually offer very useful and balanced feedback, including substantial suggestions for improvement. But if vendor-driven advocate reviews represent a significant portion or all of the reviews of a product, then the results are skewed and not trustworthy to buyers. So we label each review with information about how the individual was invited, and whether they were offered a thank-you incentive. We’re the only review site to have tracked the source of every single review since inception.

We also use a proprietary algorithm called trScore™ that eliminates selection bias from the product's overall score.
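
trScore™ itself is proprietary, so the sketch below is emphatically not how it works; it only illustrates the general idea behind mitigating selection bias. Because we know how every reviewer was sourced, a score can average within each acquisition channel before averaging across channels, so a flood of vendor-invited reviews can't single-handedly move the number. The data and field names here are invented.

```python
from collections import defaultdict
from statistics import mean

# Each review carries its rating and how the reviewer was sourced
# (TrustRadius tracks the source of every review, as noted above).
reviews = [
    {"rating": 9, "source": "vendor_invited"},
    {"rating": 10, "source": "vendor_invited"},
    {"rating": 9, "source": "vendor_invited"},
    {"rating": 6, "source": "organic"},
    {"rating": 7, "source": "organic"},
]

def channel_balanced_score(reviews: list) -> float:
    """Average ratings within each source channel, then across channels,
    so one heavily worked channel can't dominate the overall score."""
    by_channel = defaultdict(list)
    for review in reviews:
        by_channel[review["source"]].append(review["rating"])
    return mean(mean(ratings) for ratings in by_channel.values())

print(f"naive mean: {mean(r['rating'] for r in reviews):.2f}")     # 8.20
print(f"channel-balanced: {channel_balanced_score(reviews):.2f}")  # 7.92
```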

Quantity

We’ve already mentioned the pitfalls of having an overall quantity of reviews be a driving goal for the company, but that said, it is important to have a variety of perspectives. So for any product to be on a TrustMap or in a Buyer’s Guide, it needs to have at least 10 reviews and ratings, if not more.

However—and this is really important—the number of reviews of a product does not factor into its position on our TrustMaps. We have seen other review sites use this as a tactic to get vendors to drive more and more reviews, promising incremental improvement in their placement on a 2×2 grid with each new review. But this is misleading to buyers. The fact that you can drive more reviews of your product doesn't mean it's a better solution. And it creates an unfair playing field for vendors—for example, it puts enterprise-focused vendors who might have fewer users, or new entrants in a well-established category, at a disadvantage.

Incentives

Using small thank-you rewards is common practice in gathering business software reviews. Offering these gifts helps increase response rates (especially from those in the middle, neither advocates nor detractors, who are less likely to write a review unprompted). It also produces higher quality reviews, since people are willing to spend more time on their answers. But offering gifts in exchange for positive reviews is strictly against our policies and, if we suspect a vendor is engaging in this practice, we swiftly remove those reviews from our site.

Additionally, offering thank-you rewards puts a review site at risk of attracting fake reviewers: people who set up a fake profile to write a review of anything, regardless of whether they use the product. This is why we are significantly less aggressive than other review sites at promoting these offers. We don't offer gift cards broadly; rather, we reserve those offers for individuals we have already vetted and know to be real users.

Maintaining buyer trust

We at TrustRadius are learning as we go as well, keeping a pulse on what buyers want from reviews and trying our best to meet those needs. But from the beginning our position has been that crowdsourced perspectives are highly useful to buyers, and that trust in the content is more important than the size of the crowd.

If you have any feedback on these practices, or are interested in learning best practices for sourcing reviews of your product, we’d love to hear from you. Please contact me at megan@trustradius.com.

About the Author

Megan Headley
Megan leads Research at TrustRadius, whose mission is to ensure TrustRadius delivers high-quality, useful, and, above all, trustworthy user feedback to help prospective software buyers make more informed decisions. Before joining TrustRadius, Megan was Director of Sales and Marketing at Stratfor, where she was in charge of growing the company's B2C revenue stream through email marketing and other channels. She enjoys traveling, reading, and hiking.