or algorithm, following the activity of a reviewer is how you know whether you can trust his or her reviews.

What to Do with Fake Reviews
All these techniques can suggest a review is fraudulent, but by themselves they do not solve the problem of how a site can make certain its visitors can trust the reviews it hosts. One approach to resolving that doubt is exemplified by Yelp, which separates recommended reviews from those that do not satisfy its screening standards. The latter are still available for users to read, but they are marked as “not recommended,” with an explanation of what that means.

Other approaches involve limiting who can write reviews in the first place. Ben Shneiderman, a professor of computer science at the University of Maryland, contributed the chapter “Building Trusted Social Media Communities: A Research Roadmap for Promoting Credible Content” to the book Matei edited. He writes, “Many strategies are being tried to ensure that only trusted contributors participate, such as raising the barriers to entry for contributors by requiring a log-in (no anonymous contributions), identity verification, background checks, probation periods, and public performance histories. Greater transparency about who the contributors are and what their past is has the potential to increase trust in their future contributions.”

Transparency is a vital element, in the view of Purdue’s Matei. “Trust and credibility are the product of the transparency of a user’s activities multiplied by the speed and the cost of verifying them,” he explains. “Individuals who interact online through ratings, reviewing, collaboration, or shopping trust each other on the basis of the traceable activity they leave behind. The easier it is to track them, the more trust and credibility are generated.” He cites eBay as a good example of how that can be implemented: users can rate both buyers and sellers, and each user’s overall rating is visible to all.

Amazon, on the other hand, does not do as good a job, in Matei’s view. “The credibility of its reviewers is mostly a function of productivity,” he says. Amazon does provide links that let users examine a reviewer’s other reviews, and it notes whether the reviewer is known to have purchased the item being reviewed (although, of course, if the reviewer purchased it elsewhere, Amazon would not know about it). Some reviewers are in the site’s “Hall of Fame,” meaning other visitors have rated their reviews as helpful, “but the exact algorithm and the numbers behind it are not immediately available,” says Matei.

Matei also points to Wikipedia as a site that, despite its size and popularity, has its credibility questioned because no one really knows who its authors and editors are.

Another approach to promoting trustworthy reviews is illustrated by Reddit, a community where the visibility of content is determined by up-and-down votes from its users. When users visit a Reddit community, what they see first is based on the other members’ opinion of its relative value. That can be distorted, of course (people tend to vote down comments they disagree with, regardless of their quality), but in general it represents a consensus about the value of content.

It is vital for e-commerce sites and social media communities that participants be confident the reviews they read are authentic. As more and more people do their shopping online and rely on strangers’ recommendations, the ability to filter out fraudulent reviews, whether left by a business’s competitors or by non-customers hired for the purpose, becomes critically important. Computer science can help with ways both to identify and sequester fraudulent reviews and to promote trustworthy ones, and sites can implement measures that help visitors track reviewers’ activities and judge which are the most reliable.

Further Reading

Bertino, E. and Matei, S.A., eds.
Roles, Trust, and Reputation in Social Media Knowledge Markets, Springer (2014).

Hu, N., Liu, L., and Sambamurthy, V.
Fraud Detection in Online Consumer Reviews, Decision Support Systems, vol. 50, no. 3 (2011).

Liu, B.
Sentiment Analysis and Opinion Mining (Synthesis Lectures on Human Language Technologies), Morgan & Claypool (2012).

Liu, B., Mukherjee, A., and Glance, N.
Spotting Fake Reviewer Groups in Consumer Reviews, Proceedings of the 2012 World Wide Web Conference, http://

Luca, M. and Zervas, G.
Fake It Till You Make It: Reputation, Competition, and Yelp Review Fraud, Harvard Business School NOM Unit Working Paper No. 14-006 (2013).

Logan Kugler is a freelance technology writer based in Clearwater, FL. He has written for over 60 major

© 2014 ACM 0001-0782/14/10 $15.00