its accuracy, is highly popular and
endorsed by many, exerting an influence against which we haven’t yet developed antibodies. Our vulnerability
makes it possible for a bot to acquire
significant influence, even unintentionally. 2 Sophisticated bots can generate personas that appear as credible
followers, and thus are more difficult
for both people and filtering algorithms to detect. They make for valuable entities on the fake follower market, and allegations of acquiring fake followers have touched several prominent political figures in the U.S. and worldwide.
Journalists, analysts, and researchers report a growing number of examples of the potential dangers posed by social bots. These include
the unwarranted consequences that
the widespread diffusion of bots
may have on the stability of markets.
There have been claims that Twitter signals can be leveraged to predict the stock market, 5 and there is
an increasing amount of evidence
showing that market operators pay
attention and react promptly to information from social media. On April
23, 2013, for example, the Syrian
Electronic Army hacked the Twitter
account of the Associated Press and
posted a false rumor about a terror
attack on the White House in which
President Obama was allegedly injured. This provoked an immediate
crash in the stock market. On May 6, 2010, a flash crash occurred in the U.S.
stock market, when the Dow Jones
plunged over 1,000 points (about 9%)
within minutes—the biggest one-day
point decline in history. After a five-month-long investigation, the role of high-frequency trading bots became obvious, but it remains unclear whether these bots had access to information from the social Web. 22
The combination of social bots with an increasing reliance on automatic trading systems that, at least partially, exploit information from social media is rife with risks. Bots can amplify the visibility of misleading information, while automatic trading systems lack fact-checking capabilities. A recent orchestrated bot campaign successfully created the appearance of a sustained discussion about a tech company called Cynk. Automatic trading algorithms picked up this conversation and started trading heavily in the company's stock. This resulted in a 200-fold increase in market value, bringing the company's worth to $5 billion.b By the time analysts recognized the orchestration behind this operation and stock trading was suspended, the losses were real.
The Bot Effect
These anecdotes illustrate the consequences that tampering with the social Web may have for our increasingly interconnected society. In addition to potentially endangering democracy, causing panic during emergencies, and affecting the stock market, social bots can harm our society in even subtler ways. A recent study demonstrated the vulnerability of social media users to a social botnet designed to expose private information, like phone numbers and addresses. 7 This kind of vulnerability can be exploited by cybercriminals and cause an erosion of trust in social media. 22 Bots can also hinder the advancement of public policy by creating the impression of a grassroots movement of contrarians, or contribute to the strong polarization of political discussion observed in social media. 12 They can alter the perception of social media influence, artificially enlarging the audience of some people, 14 or they can ruin the reputation of a company, for commercial or political purposes. 25 A recent study demonstrated that emotions are contagious on social media: 23 elusive bots could easily infiltrate a population of unaware humans and manipulate them to affect their perception of reality, with unpredictable results. Indirect social and economic effects of social bot activity include the alteration of social media analytics, adopted for various purposes such as TV ratings,c expert findings, 40 and scientific impact measurement.d
b The Curious Case of Cynk, an Abandoned Tech Company Now Worth $5 Billion; mashable.com/2014/07/10/cynk
c Nielsen's New Twitter TV Ratings Are a Total Scam. Here's Why; defamer.gawker.com/nielsens-new-twitter-tv-ratings-are-a-total-scam-here-1442214842
d altmetrics: a manifesto; altmetrics.org/manifesto/
Act Like a Human, Think Like a Bot
One of the greatest challenges for bot detection in social media lies in understanding what modern social bots can do. 6 Early bots mainly performed one type of activity: posting
content automatically. These bots
were naive and easy to spot by trivial
detection strategies, such as focusing on a high volume of content generation. In 2011, James Caverlee’s
team at Texas A&M University implemented a honeypot trap that managed to detect thousands of social
bots. 24 The idea was simple and effective: the team created a few Twitter
accounts (bots) whose role was solely
to create nonsensical tweets with gibberish content, in which no human
would ever be interested. However,
these accounts attracted many followers. Further inspection confirmed
that the suspicious followers were indeed social bots trying to grow their
social circles by blindly following
random accounts.
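The two early detection ideas described above, flagging accounts that post at implausibly high rates and flagging accounts that follow honeypot decoys no human would ever follow, can be illustrated with a minimal Python sketch. All account records, thresholds, and honeypot identifiers here are invented for illustration; a real system would pull this data from the Twitter API rather than hardcoded lists.

```python
# Illustrative sketch of two naive bot-detection heuristics.
# Data and thresholds are hypothetical, not from any real deployment.

HONEYPOT_IDS = {"honeypot_1", "honeypot_2"}   # decoy accounts posting only gibberish
MAX_HUMAN_POSTS_PER_DAY = 200                 # assumed ceiling for plausible human activity

def flag_by_volume(accounts):
    """Return IDs of accounts posting more than a human plausibly could."""
    return {a["id"] for a in accounts
            if a["posts_per_day"] > MAX_HUMAN_POSTS_PER_DAY}

def flag_by_honeypot(accounts):
    """Return IDs of accounts following a honeypot: no human would."""
    return {a["id"] for a in accounts
            if HONEYPOT_IDS & set(a["follows"])}

accounts = [
    {"id": "alice",  "posts_per_day": 12,  "follows": ["bob"]},
    {"id": "spam42", "posts_per_day": 900, "follows": ["alice"]},
    {"id": "bot7",   "posts_per_day": 30,  "follows": ["honeypot_1"]},
]

suspects = flag_by_volume(accounts) | flag_by_honeypot(accounts)
print(sorted(suspects))  # ['bot7', 'spam42']
```

Note that each heuristic catches a different account: the volume rule misses the slow-posting bot, while the honeypot rule misses the spammer, which is why Caverlee's trap complemented rather than replaced volume-based filtering.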
In recent years, Twitter bots have
become increasingly sophisticated,
making their detection more difficult. The boundary between human-like and bot-like behavior is now
fuzzier. For example, social bots can
search the Web for information and
media to fill their profiles, and post
collected material at predetermined
times, emulating the human temporal signature of content production
and consumption—including circadian patterns of daily activity and
temporal spikes of information generation. 19 They can even engage in
more complex types of interactions,
such as entertaining conversations
with other people, commenting on
their posts, and answering their questions. 22 Some bots specifically aim to
achieve greater influence by gathering new followers and expanding
their social circles; they can search
the social network for popular and
influential people and follow them
or capture their attention by sending
them inquiries, in the hope of being noticed. 2 To acquire visibility, they can
infiltrate popular discussions, generating topically appropriate—and
even potentially interesting—content by identifying relevant keywords
and searching online for information
fitting that conversation. 17 After the