systems and their effects on consumer fragmentation.
Management Science 60, 4 (2013), 805–823.
[15] Weisberg, J. Bubble trouble: Is Web personalization turning us into solipsistic twits? Slate (June 10, 2011); http://www.slate.com/articles/news_and_
[16] Hampton, K., Sessions Goulet, L., Rainie, L., and Purcell, K. Social networking sites and our lives. Pew Research, June 16, 2011; http://www.pewinternet.org/2011/06/16/
[17] Tufekci, Z. How Facebook’s algorithm suppresses content diversity (modestly) and how the newsfeed rules the clicks. Medium (May 7, 2015); https://
[18] Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., Stanley, H. E., and Quattrociocchi, W. The spreading of misinformation online. Proceedings of the National Academy of Sciences 113, 3 (2016), 554–559.
[19] Maheshwari, S. How fake news goes viral: A case study. The New York Times (Nov. 20, 2016); http://
[20] Silverman, C. This analysis shows how viral fake election news stories outperformed real news on Facebook. BuzzFeed (Nov. 16, 2016); https://www.
[21] Kang, C. Fake news onslaught targets pizzeria as nest of child-trafficking. The New York Times (Nov. 21, 2016); http://www.nytimes.com/2016/11/21/
[22] Bender, B. and Hanna, A. Flynn under fire for fake news. Politico (Dec. 5, 2016); http://www.politico.
[23] Jin, F., Dougherty, E., Saraf, P., Cao, Y., and Ramakrishnan, N. Epidemiological modeling of news and rumors on Twitter. In Proceedings of the Seventh Workshop on Social Network Mining and Analysis. ACM Press, New York, 2013, article no. 8.
[24] Kottasova, I. Facebook and Google to stop ads from appearing on fake news sites. CNN (Nov. 15, 2016);
[25] Heath, A. Facebook is going to use Snopes and other fact-checkers to combat and bury ‘fake news.’ Business Insider (Dec. 15, 2016); http://www.businessinsider.com/facebook-will-fact-check-label-fake-news-in-news-feed-2016-12
[26] Resnick, P., Kelly Garrett, R., Kriplean, T., Munson, S. A., and Jomini Stroud, N. Bursting your (filter) bubble: Strategies for promoting diverse exposure. In Proceedings of the 2013 Conference on Computer Supported Cooperative Work (San Antonio, TX, Feb. 23–27). ACM Press, New York, 2013, 95–100.
Dominic DiFranzo is a post-doctoral associate in the Social Media Lab at Cornell University, Ithaca, NY. He holds a Ph.D. in computer science from Rensselaer Polytechnic Institute, Troy, NY, and was a member of the Tetherless World Constellation.
Kristine Gloria-Garcia joined the Aspen Institute Communications and Society Program as a project manager in September 2016; previously, she served as a visiting researcher at the Internet Policy Research Initiative at MIT, Cambridge, MA, and as a privacy research fellow at the Startup Policy Lab. She holds a Ph.D. in cognitive science from Rensselaer Polytechnic Institute, Troy, NY, and a master’s in media studies from the University of Texas at Austin.
© 2017 ACM 1528-4972/17/03. $15.00
Facebook has announced it will work with fact-checking organizations like Snopes and Politifact [25]. It is also developing better automatic fake-news-detection systems that will limit the spread of such content.
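A common approach to automatic detection of this kind is supervised text classification. The sketch below illustrates the core idea with a from-scratch naive Bayes classifier over a tiny invented set of labeled headlines; it is not any platform's actual system, and real detectors draw on far richer signals (source reputation, propagation patterns) and much larger training data.

```python
import math
from collections import Counter

# Toy labeled headlines -- entirely invented for illustration.
TRAIN = [
    ("pope endorses candidate in shock announcement", "fake"),
    ("secret memo proves election was rigged", "fake"),
    ("miracle cure hidden by doctors goes viral", "fake"),
    ("senate passes budget bill after long debate", "real"),
    ("city council approves new transit funding", "real"),
    ("court hears arguments in patent dispute", "real"),
]

def train(examples):
    """Fit a multinomial naive Bayes model: per-class word counts."""
    word_counts = {"fake": Counter(), "real": Counter()}
    label_counts = Counter()
    for text, label in examples:
        label_counts[label] += 1
        word_counts[label].update(text.split())
    vocab = {w for c in word_counts.values() for w in c}
    return word_counts, label_counts, vocab

def score(text, word_counts, label_counts, vocab):
    """Return the most probable label, using Laplace smoothing."""
    total = sum(label_counts.values())
    best_label, best_logp = None, float("-inf")
    for label in label_counts:
        logp = math.log(label_counts[label] / total)  # class prior
        n = sum(word_counts[label].values())
        for w in text.split():
            logp += math.log((word_counts[label][w] + 1) / (n + len(vocab)))
        if logp > best_logp:
            best_label, best_logp = label, logp
    return best_label

model = train(TRAIN)
print(score("shock memo proves secret rigged announcement", *model))  # prints 'fake'
```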
Researchers and software developers have been exploring tools to help break out of filter bubbles [26], including filtering algorithms and user interfaces that give users more control and expose them to more diverse content. Other tools (such as the browser plugin Ghostery and the search engine DuckDuckGo) help anonymize users' actions online, thus disabling personalized recommendations.
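One concrete way a filtering algorithm can promote diversity is greedy re-ranking that trades relevance against similarity to items already selected, in the spirit of maximal marginal relevance. The feed items, topic tags, and trade-off weight below are invented for illustration:

```python
def jaccard(a, b):
    """Topic-tag overlap between two items, in [0, 1]."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def diversify(items, k, trade_off=0.5):
    """Greedily pick k items, balancing relevance against similarity
    to items already chosen. items: list of (id, relevance, tags)."""
    chosen, pool = [], list(items)
    while pool and len(chosen) < k:
        def mmr(item):
            _, rel, tags = item
            max_sim = max((jaccard(tags, t) for _, _, t in chosen), default=0.0)
            return trade_off * rel - (1 - trade_off) * max_sim
        best = max(pool, key=mmr)
        chosen.append(best)
        pool.remove(best)
    return [i for i, _, _ in chosen]

# A purely hypothetical feed, ranked by relevance alone.
feed = [
    ("a1", 0.95, {"politics", "election"}),
    ("a2", 0.93, {"politics", "election"}),
    ("a3", 0.70, {"science", "climate"}),
    ("a4", 0.65, {"sports"}),
]
print(diversify(feed, 3))  # → ['a1', 'a3', 'a4']
```

Note that plain relevance ranking would pick the two near-duplicate politics items first; the similarity penalty is what surfaces the science and sports stories.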
Bot and spam detection is another
major area of research. Many social-media platforms already use a range
of tools, from machine learning to social network analysis, to detect and
stop bots. Independent groups and researchers have also developed tools to
detect bots; for example, researchers
at Indiana University have developed
BotOrNot (http://truthy.indiana.edu/botornot/), a service that allows users
to check if a particular Twitter user is,
in fact, a bot.
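Detectors of this kind typically score account-level features. The sketch below shows the flavor of a feature-based heuristic; the features, thresholds, and weights are invented for illustration and are not BotOrNot's actual model:

```python
from dataclasses import dataclass

@dataclass
class Account:
    tweets_per_day: float
    followers: int
    following: int
    has_default_avatar: bool
    account_age_days: int

def bot_score(acct):
    """Heuristic bot likelihood in [0, 1]; weights are illustrative."""
    score = 0.0
    if acct.tweets_per_day > 50:      # sustained high-volume posting
        score += 0.35
    ratio = acct.following / max(acct.followers, 1)
    if ratio > 20:                    # follows far more than it is followed
        score += 0.25
    if acct.has_default_avatar:       # no profile customization
        score += 0.15
    if acct.account_age_days < 30:    # newly created account
        score += 0.25
    return score

likely_bot = Account(tweets_per_day=200, followers=3, following=900,
                     has_default_avatar=True, account_age_days=7)
print(bot_score(likely_bot))
```

Production systems replace hand-set thresholds like these with weights learned from labeled accounts, and add network-level signals such as retweet patterns.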
In addition to technical enhancements and design choices, what other avenues, even public policymaking, are available for combating these issues? This may be a particularly difficult question in the U.S. due to free-speech protections under the First Amendment of the U.S. Constitution. We already see legal and political tension as Twitter implements internal policies for flagging hate speech and closing specific accounts. Others have suggested reinstating media- and civic-literacy initiatives to help users discern for themselves which news sources are trustworthy.
These issues of fake news and filter bubbles are ill-defined, nuanced, and predate social media, with no ready solution, but it is vital that researchers continue to explore and investigate them from diverse technical and social perspectives. Their skills, knowledge, and voices are needed more than ever to address them.
[1] Isaac, M. Facebook, in cross hairs after election, is said to question its influence. The New York Times (Nov. 12, 2016); https://www.nytimes.
[2] Isaac, M. and Ember, S. For election day influence, Twitter ruled social media. The New York Times (Nov. 8, 2016); http://www.nytimes.com/2016/11/09/
[3] Kokalitcheva, K. Mark Zuckerberg says fake news on Facebook affecting the election is a ‘crazy idea.’ Fortune (Nov. 11, 2016); http://fortune.
[4] El-Bermawy, M. M. Your filter bubble is destroying democracy. Wired (Nov. 18, 2016); https://www.
[5] Sigdyal, P. and Wells, N. Twitter users scream ‘leave’ in Brexit vote, but ‘remain’ gains ground. CNBC (June 23, 2016); http://www.cnbc.com/2016/06/23/
[6] Mitchell, A., Gottfried, J., and Matsa, K. E. Facebook top source for political news among millennials. Pew Research Center, June 1, 2015; http://www.
[7] Bond, R. M., Fariss, C. J., Jones, J. J., Kramer, A. D. I., Marlow, C., Settle, J. E., and Fowler, J. H. A 61-million-person experiment in social influence and political mobilization. Nature 489, 7415 (Sept. 2012),
[8] Taylor, D. G., Lewin, J. E., and Strutton, D. Friends, fans, and followers: Do ads work on social networks? Journal of Advertising Research 51, 1 (2011),
[9] Hampton, K. and Hargittai, E. Stop blaming Facebook for Trump’s election win. The Hill (Nov. 23, 2016);
[10] Fields, J., Sengupta, S., White, J., Spetka, S. et al. Botnet campaign detection on Twitter. Master of Science thesis in Computer and Information Sciences, Department of Computer Sciences, SUNY Polytechnic Institute, Utica, NY, 2016; https://
[11] Kollanyi, B., Howard, P. N., and Woolley, S. C. Bots and automation over Twitter during the third U.S. presidential debate. Political Bots (Oct. 27, 2016);
[12] Pariser, E. The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think. Penguin, New York, 2011.
[13] Bakshy, E., Messing, S., and Adamic, L. Exposure to ideologically diverse news and opinion on Facebook. Science 348, 6239 (2015), 1130–1132.
[14] Hosanagar, K., Fleder, D., Lee, D., and Buja, A. Will the global village fracture into tribes? Recommender