The computer was much more accurate than humans in detecting fake reviews (90% vs 60%). Humans are just bad at telling when people are lying.
NPR, Aug 28, 2011
Online Review Too Good To Be True? Sometimes It Is
From local plumbers to luxury hotels, just about everyone selling a service these days has an online reputation. Increasingly, that reputation is shaped by online reviews. Customer ratings on sites such as Yelp and Urbanspoon can, for example, make or break a new restaurant.
It's no wonder, then, that some businesses are trying to fake us out. On Craigslist and online forums, posters are offering to buy and sell gushing reviews for just a few bucks; potential customers aren't able to tell the difference.
To help sort the genuinely delighted customers from profit-driven praise, Cornell University researcher Jeff Hancock and his colleagues have developed software that successfully unmasks fake online hotel reviews.
Hancock tells Laura Sullivan, guest host of weekends on All Things Considered, that too many bogus ratings could undermine the system.
"It gets at the very basic idea of what these reviews are about: trust," Hancock says.
The researchers started by "training" their computer algorithm on both fake reviews written for the study and real online reviews. Their software then went head-to-head against real humans and summarily defeated them: The computer was 90 percent accurate while the humans were correct three out of five times at best.
It turns out humans are just bad at telling when people are lying.
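The paper's actual classifier combined n-gram features with psycholinguistic cues in an SVM, but the basic "train on labeled fake and real reviews, then score new text" idea can be illustrated with a much simpler sketch. Below is a minimal Naive Bayes word-count classifier on a tiny invented toy dataset (the reviews and labels are made up for illustration and are not from the study):

```python
import math
from collections import Counter

def train(docs_by_label):
    """Count word occurrences per label over a toy labeled corpus."""
    vocab = set()
    counts, totals = {}, {}
    for label, docs in docs_by_label.items():
        c = Counter()
        for doc in docs:
            c.update(doc.split())
        counts[label] = c
        totals[label] = sum(c.values())
        vocab |= set(c)
    return counts, totals, vocab

def classify(text, counts, totals, vocab):
    """Pick the label with the highest log-likelihood,
    using Laplace (add-one) smoothing and a uniform prior."""
    best_label, best_lp = None, float("-inf")
    for label in counts:
        lp = sum(
            math.log((counts[label][w] + 1) / (totals[label] + len(vocab)))
            for w in text.split()
        )
        if lp > best_lp:
            best_label, best_lp = label, lp
    return best_label

# Toy training data -- invented examples, not the Ott et al. corpus.
training = {
    "fake": [
        "absolutely amazing stay best hotel ever luxurious perfect",
        "stunning rooms amazing staff best experience ever",
    ],
    "truthful": [
        "room was clean but the elevator was slow",
        "decent location breakfast was fine parking cost extra",
    ],
}

counts, totals, vocab = train(training)
print(classify("amazing hotel best staff ever", counts, totals, vocab))
# superlative-heavy text scores higher under the "fake" word distribution
```

The real system's edge over human judges comes from exactly this kind of statistical regularity: deceptive reviews over-use certain word patterns (e.g. superlatives and scene-setting language) that a model can count but a casual reader does not consciously track.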
Here is the paper:
"Finding Deceptive Opinion Spam by Any Stretch of the Imagination," by Myle Ott, Yejin Choi, Claire Cardie, and Jeffrey Hancock
See also the New York Times article
"In a Race to Out-Rave, 5-Star Web Reviews Go for $5," which covers this research.