Fake reviews are killing our ability to google for the truth.
In this Tech First Draft, I chat with TrustRadius CEO Vinay Bhagat about fake reviews, the company’s new TRUE program, and why bad reviews aren’t necessarily bad for sales.
- See below for full video or links to the podcast
- Keep scrolling for the full transcript
Interestingly, when I was prepping for the show, I found that two of the top nine Google suggestions for “fake reviews” are “fake reviews generator” and “fake reviews generator free.”
We chat about:
- Talk to me about the scale of the fake reviews problem … how prevalent is it?
- What are the different kinds or levels of “fake” reviews?
- Have you talked to people who have made big decisions based on fake reviews and later regretted it?
- You’ve introduced something to stop fake reviews called TRUE … what is it?
- You say your algorithm is un-gameable. How so?
- What makes a review trustworthy? What makes it valuable?
- Can a bad review lead to a sale?
Listen to the podcast here:
Or, listen and subscribe wherever podcasts are published:
- Apple podcasts
- Google podcasts
- Spotify podcasts
- And multiple other places … see them all on Anchor
You can also watch on YouTube
Or, of course, you can read the full transcript here …
Read the transcript: killing fake reviews
John Koetsier: Fake reviews are killing our ability to google for the truth. I googled this morning, and two of the top nine Google suggestions for “fake reviews” are “fake reviews generator” and “fake reviews generator free.” So we know that it’s hard to trust reviews for consumer products. What about reviews for B2B software?
If I buy a consumer good at Amazon or somewhere else, I might spend $50. I might spend $100, maybe $500. If you’re buying B2B software it could be $500,000, and could easily be more with multi-year contracts. So today we’re chatting with TrustRadius CEO Vinay Bhagat about a new program to kill fake reviews. Vinay, welcome!
Vinay Bhagat: Thank you, I’m glad to be with you.
John Koetsier: Wonderful. Super happy to have you. It’s been a while since we’ve chatted, so great to see you again. Talk to me first about the scale of fake reviews. How prevalent is the problem? How big a deal is it?
Vinay Bhagat: It’s a significant problem in the B2B industry. We end up rejecting about 23% of the reviews that are submitted. We do so for a variety of reasons: fake reviews are certainly part of the problem, and quality of content is a secondary problem as well.
John Koetsier: Interesting, interesting. So 23%, I’m going to guess that’s bigger in B2C, but in B2B that’s big enough. Talk to me about the different levels or kinds of fake reviews. I mean, you could have incentivized reviews, you could have totally fabricated reviews, what do you see in the market?
Vinay Bhagat: Yeah. About 40% of the reviews we reject are just completely suspicious. They look outright fraudulent. We see people … now, to write a review on TrustRadius you have to use either a corporate email address, which is very hard to fake, or a LinkedIn ID, which is possible to fake. We do see people creating fake LinkedIn IDs. We also see people who, along with creating fake LinkedIn IDs, plagiarize content from other sites and try to write content that appears to be real, but when you really dig in and scratch beneath the surface you can tell that it’s fake.
So in addition to the technical authentication, we put every review through a human screening as well, where we have a team of people read the reviews, check for plagiarism, make sure there are no conflicts of interest, make sure the reviews are written by someone who’s likely to be using that type of product, and also spot-check other things that we view as tells that a review may be fake. If a reviewer is suspected of being fraudulent, one of the other things that we’ve done is create honey traps, where we actually create fake products and invite cohorts of reviewers that we think might be contributing fake reviews to review those products. And if they do, then it’s a sure tell.
John Koetsier: Wow, wow. Interesting. It’s a sting operation!
Vinay Bhagat: Absolutely. It works really, really well. Incredible.
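(Purely for illustration, here’s a minimal Python sketch of the honey trap idea Vinay describes: suspected reviewers who go on to review a decoy product that no real customer could have used are flagged. The reviewer IDs and data structures are hypothetical, not TrustRadius’s actual system.)

```python
# Hypothetical sketch of the "honey trap" mechanism described above:
# reviewers already under suspicion are invited to review a decoy product.
# Anyone who actually submits a review for it is confirmed as fraudulent.
suspected_reviewers = {"reviewer_17", "reviewer_42", "reviewer_88"}  # hypothetical IDs
decoy_product_reviewers = {"reviewer_42", "reviewer_99"}             # reviewed the fake product

confirmed_fraudulent = suspected_reviewers & decoy_product_reviewers
print(confirmed_fraudulent)  # {'reviewer_42'}
```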
John Koetsier: Super interesting, I had no idea. Now that sounds like a lot of effort. I mean, talk to me, I don’t know if you’ve quantified it, you probably have quantified it, but if you look at an average review that comes in, how much time are you spending with that review to ensure that it’s verified before accepting it?
Vinay Bhagat: Probably about 15 minutes for each review. It is a material amount of time, but the stakes are high in B2B. This isn’t about picking toothpaste. This is about picking a piece of software that you run your business on. A, it could be really expensive. B, it can be catastrophic if you make the wrong decision, and it can be career limiting for you if you make the wrong call as well.
I started TrustRadius because we bought a piece of enterprise tech for my last company. It cost us $200,000, was rolled out to 450 people, and we made a bad call. It wasn’t because of a fake review, it was because we didn’t have the right intelligence, but the consequences of making a bad call are really quite catastrophic.
John Koetsier: Interesting, and that was one of the questions I was going to have for you. I mean, have you talked to somebody who has made that kind of bad call, maybe based on bad reviews or other data or something like that? Sounds like you were that person, or one of those people. Have you talked to others who have made $100,000, $500,000 decisions based on bad reviews and come to regret it?
Vinay Bhagat: Yeah, we actually see, in our own review data and outside it, people who voice regret over decisions, or who say they found the right product only after a journey through other products that didn’t quite meet their needs. That’s a very common piece of content that we see surface in the reviews on our site and in the one-on-one interviews that we do with buyers and reviewers. Those experiences are fairly common.
Another experience that’s even more common is how people actually make no decision or protract a purchase process if they don’t feel confident in the data that they are getting. Now that could be either because they see reviews and they look fraudulent, or perhaps because there isn’t enough data to make a decision. In fact, in our research 77% of buyers say it’s really important for them to understand the cons before they buy a product. They want to know what they’re getting into. They want to know whether it’s going to work for their use case and what the real world experience is going to be. So maybe a bigger challenge than making a bad decision is making no decision at all.
John Koetsier: Interesting, interesting. So you’ve introduced something to stop fake reviews. Obviously you’ve been doing a lot of vetting for years now, but you’ve introduced something new that you call TRUE. Can you talk to us about that, what that is?
Vinay Bhagat: Yeah, it’s interesting. In B2B, fake reviews and biased reviews are an issue, but perhaps a bigger issue than outright fake reviews is where vendors cherry-pick who they invite to give reviews. It’s a very common practice for vendors to run NPS surveys, pull out the people who give them a 9 out of 10 or a 10 out of 10, and then drive them to review sites. For quite a while now we’ve seen that as an issue, and it’s impacted the entire industry. We’re certainly not the only site that has seen this phenomenon.
About three years ago, we introduced an algorithm called trScore which corrects for that bias. If we have a review or a rating that we have sourced completely independently through our community, in other words from the people who use our site and convert into contributors, we weight it in the algorithm a hundred times more than a review or rating driven by a vendor, just because statistically we have seen a great degree of bias in the samples that vendors send us. Now, if a vendor becomes a good actor, we’ve always had an approach where the vendor can prove to us that they have been open in their sourcing methodology and invited every customer to review them. We would then change how the algorithm works for them and give them full scoring, as it were, for the data they submit. But most of the market is still in the camp of cherry-picking who writes reviews.
So the TRUE campaign is really more of a motivation and recognition strategy to make vendors do the right thing, which is to invite all of their customers, without bias, to write reviews, and to remove the cherry-picking. The value of that to the market is that you can trust not only that the reviews are real and legitimate and not fraudulent, again with all the protections we have against fraudulent reviews, but that the data is reliable and reflective of reality, not just a biased sample of happy customers.
The vendors who embrace this really gain in two ways. One, they differentiate themselves in the market as being open to sharing holistic feedback about the product experience. Hopefully that leads to more trust and faster deal cycles and higher conversion. But it also means they’re going to listen more attentively to their customers, because if you only listen to your happy customers, you’re not listening. You’ve got to have those neutral and even negative viewpoints out there in order to really be an effective listening company and improve your product.
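(For illustration, here’s a minimal Python sketch of the kind of bias-corrected weighting Vinay describes, where an independently sourced rating counts a hundred times more than a vendor-driven one. The class, function names, and example numbers are assumptions for the sketch; this is not the actual trScore algorithm.)

```python
from dataclasses import dataclass

# Hypothetical data model: each rating is tagged with how it was sourced.
@dataclass
class Rating:
    score: float          # e.g. 1.0 to 5.0
    vendor_sourced: bool  # True if the vendor drove this reviewer to the site

def weighted_score(ratings, independent_weight=100, vendor_weight=1):
    """Bias-corrected average: independently sourced ratings are weighted
    far more heavily (here 100x) than vendor-driven ones, per the idea
    described above. Purely illustrative, not the real trScore."""
    total, weight_sum = 0.0, 0.0
    for r in ratings:
        w = vendor_weight if r.vendor_sourced else independent_weight
        total += w * r.score
        weight_sum += w
    return total / weight_sum if weight_sum else None

# Example: a flood of vendor-driven perfect scores barely moves the result
# relative to a handful of independent ratings.
ratings = [Rating(5.0, vendor_sourced=True)] * 50 + [Rating(3.5, vendor_sourced=False)] * 5
print(round(weighted_score(ratings), 2))  # ~3.64, pulled toward the independent 3.5s
```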
John Koetsier: It’s super interesting that you’re talking about that because that’s been a longstanding practice in mobile apps where you will have a little pop-up that comes, ‘How are you liking it? How are you liking this app?’ And if you rate it five stars or four stars, then it’ll transfer you over to the app store or to Google Play to write a review or something like that.
It’s also interesting because a friend of mine, Robert Scoble, says that when he’s looking at ratings and reviews he’s actually looking at the one star reviews. He’s getting the most benefit out of those and if he feels like, hey, they had a problem with it that I won’t have, or they’re looking for something that I’m not looking for in this product, then he’s not too worried about that one star. So it’s kind of interesting to hear you say how you’re putting that together. Really, really cool stuff.
Vinay Bhagat: You know, we’ve done a lot of primary research about trust factors; after all, “Trust” is in our name. What buyers tell us is that they’re really looking for balance. Many of them will do exactly what you’re saying: go and look at the ones and twos, and rule them out if they have a use case that’s grossly dissimilar to theirs. Others will filter the reviews to try and find people with a use case that’s similar to theirs, and others will look at the middle of the pack, where they’re looking for someone who has a very balanced viewpoint. That balance is really critical.
In fact, of the top 10 indicators of what matters to buyers in terms of whether they can trust a review site, balance is the number one factor. They also look at the experience of the reviewers, to see if they have a background similar to their own. They look for detail and specificity. They look for the quality of the feedback. They look at whether the grammar and the authenticity feel real, and they look at volume too. But I think buyers are more discerning than most people think. I think they can see when something just smells bogus.
John Koetsier: Yes. It’d be interesting to know the average rating on a review site. That would be very interesting to see.
Vinay Bhagat: Well, on TrustRadius it’s around 3.9 out of 5, and in the consumer space it’s around 4 to 4.1 for Yelp and TripAdvisor. On some of the B2B sites that don’t correct for the bias, we see higher average scores for many products than we see on our site, just because they haven’t taken that step of correcting for it.
John Koetsier: Makes sense. So one thing that you’ve said is that your algorithm is ‘ungameable.’ That’s a pretty bold statement because people try hard to game things. We know this, we’ve seen it in a lot of different areas. Talk to me about how you have the confidence to make that statement.
Vinay Bhagat: Well, we are religious about making sure we know the source of every review and rating on our site, whether it’s contributed by a vendor or sourced by us. We can classify every review and rating into those cohorts. And then, because we take such a stark view in terms of over-weighting an independently sourced data point a hundred times more than a vendor-driven data point, it’s really, really hard for a vendor to game that weighting.

That said, no system is completely infallible. We absolutely have found, at points in time, fake reviews getting published on our site as well. I think we just go the extra mile to make sure that doesn’t happen, and we correct it quickly if we have suspicions, again through techniques like the honey trap mechanism, where if we have a bunch of reviewers that we have questions about, we want to put them through that extra level of scrutiny. So no system is 100% infallible. I would just say that we’ve taken it upon ourselves to go far further than anyone else has. And the TRUE campaign, again, is not really centered around fraud per se, but around what I think is actually an even bigger problem in B2B, which is gaming of the system: introducing bias into datasets makes it hard for buyers to trust the data when they’re comparing products.
Now, scores alone don’t tell the whole picture. In fact, in our research we have learned that buyers actually look beyond scores and care a lot more about what people have to say qualitatively about a product. But at a high level we want our scores to be reliable and to reflect reality, not a biased dataset. We knew buyers were going to react positively to this move, but we’ve been pleasantly surprised by the number of leading tech brands who say, ‘Stop the insanity, we don’t want to be part of this game either. We think someone needs to step up and create a foundation for a level playing field.’
John Koetsier: Interesting. And that segues quite well to another question I had for you, which is, you know, all brands are worried about that one customer who is not happy, or that 15% of them, or whatever the number is. Maybe that’s somebody who didn’t really try it out, or it didn’t work, or it didn’t work for them for whatever reason, and the brand gets a very bad review. Can a bad review kill a sale? I mean, talk to me about that fear from a brand.
Vinay Bhagat: It’s an understandable fear. The reality, though, is that if people don’t see any balance in the commentary on a site, they’re not going to trust any of the data. The presence of one or two isolated, very negative reviews might be something that you have to explain, but by the same token, buyers understand that there are anomalous use cases that may not be the same as theirs, or maybe they understand there were weird configurations or weird one-off scenarios. And a lot has to do with how the brand engages with that person.
Do they write a response? Do they try and triage it? We’ve actually had customers give a very negative review to a product, we’ve seen the brand engage, and then the reviewer has been prompted to update their review and change their score; maybe they were just in need of a bit of triage. I think if you’ve got a large enough dataset, let’s say a hundred reviews, and you’ve got one or two isolated instances, there’s an opportunity for buyers to look at the holistic picture and say ‘these are isolated,’ and there’s also an opportunity for the vendor to triage those instances and correct them.
John Koetsier: That seems to be an answer to this viewer question as well. John Westra was saying ‘are emotionally compromised or irrational reviews weeded out?’ And it sounds like there’s a process, at least for … obviously there’s some oversight, but there’s also a process for a brand to respond.
Vinay Bhagat: Yeah, I mean, when we process reviews, one of the things that we look for is a quality factor. We grade every review an A, B, or C, and the reviews that are presented first on our pages are the ones that are both recent and graded high. In terms of our grading factors, we look at the level of qualitative insight. So someone who’s just purely ranting or raving without providing any substance, at the extreme end, that review is not going to get published. We return about 7% of reviews that we think are legitimate but just don’t meet our quality bar, so that reviewers can edit them and resubmit. Of course, some people choose not to resubmit and then they’re rejected, but some portion of people will augment their review, and there’s a minimum threshold of quality of insight for the review to get published at even a C grade. And again, the reviews that have more insight, that speak more to factors like use case and provide some depth and substance, are the ones that get prioritized first in terms of what gets displayed.
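(Again purely for illustration: a small Python sketch of surfacing reviews that are both highly graded and recent, per Vinay’s description. The field names and the exact ordering rule, grade first and recency second, are assumptions; TrustRadius’s real logic may differ.)

```python
from datetime import date

# Hypothetical review records: an A/B/C grade plus a submission date.
reviews = [
    {"id": 1, "grade": "C", "submitted": date(2020, 9, 1)},
    {"id": 2, "grade": "A", "submitted": date(2020, 6, 15)},
    {"id": 3, "grade": "A", "submitted": date(2020, 9, 20)},
    {"id": 4, "grade": "B", "submitted": date(2020, 8, 5)},
]

GRADE_RANK = {"A": 0, "B": 1, "C": 2}

# Sort by grade first, then by recency within each grade, so the most
# recent, highest-graded reviews surface first on the page.
display_order = sorted(
    reviews,
    key=lambda r: (GRADE_RANK[r["grade"]], -r["submitted"].toordinal()),
)
print([r["id"] for r in display_order])  # [3, 2, 4, 1]
```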
John Koetsier: Super interesting. Well, this has been interesting. Thank you so much for being with us. I really appreciate it. Very interesting to see just what all goes into a rating on a site like TrustRadius and others like that. Much appreciate your time.
So thanks for joining us on Tech First Draft. Whatever platform you’re on please like, subscribe, share, comment. If you’re on the podcast later on, please rate it and review it. Thank you so much. Until next time … this is John Koetsier with Tech First Draft.