Brain-computer interfaces have the potential to change the way we
use devices, and there are at least four methods for implementation.
By Evan Peck, Krysta Chauncey, Audrey Girouard, Rebecca Gulotta,
Francine Lalooses, Erin Treacy Solovey, Doug Weaver, Robert Jacob
Science fiction has long been fascinated by brain-computer interfaces (BCIs)—the use of sensors to identify brain states. From Andre Maurois' 1938 story "The Thought-Reading Machine," in which a professor stumbles on a machine that reads people's thoughts, to the recent blockbuster Avatar, in which humans control surrogate bodies with their minds, the public is captivated by the interaction between the human brain and the computers created by those brains.
Although most people are likely to
conjure images of Neo’s frightening
“head port” from The Matrix before they
dream of a university student wearing
an elastic cap studded with electrodes,
the media has closely followed less
sinister, if also less all-powerful, BCI
research. In the past year, University of
Wisconsin-Madison’s Brain-Twitter interface received Time Magazine’s honor
as the no. 9 invention of the year. Furthermore, as brain-imaging technology has become more portable and less
expensive, the human-computer interaction (HCI) community has begun to
bring science fiction closer to reality.
In the larger field of human-computer
interaction, we are often concerned
with the bandwidth of interaction between a user and the computer. How
can we give the computer more information, and more relevant information? How can the computer give us
more information without overloading
our sensory systems?
Using a mouse in addition to a keyboard increases the bandwidth from the user to the computer by augmenting the type and number of commands the computer can recognize. An application that uses audio increases the bandwidth from the computer to the user by adding to the type of information the computer can output. Seen in this context, brain-computer interfaces present an opportunity to expand the user-to-computer bandwidth in a unique and powerful way. Instead of identifying explicit actions, we can detect intent. Instead of evaluating action artifacts, we can recognize purpose.
Even more interesting, we may be able
to understand the user’s needs before
the user can articulate them.
But this is all far in the future. On
the wide continuum between analyzing electroencephalographs and Avatar's
mind-machines, where are we now?
And perhaps more importantly, where
are we going?
In this article, we will discuss several directions for research into brain-computer interaction, and the relative
merits of using these brain measurements to give the user direct or passive control of computer interfaces. We
will also introduce projects across the
world that offer a glimpse into the future of BCIs.
Imagine the following scenario:
It’s 9:30 p.m., and you’re driving in
heavy traffic through Boston, unsure
of where you’re going. Your phone is
alerting you to text messages that your
sister is sending every 30 seconds. Your
GPS is commanding you to turn in half
a mile, then a tenth of a mile, but still,
you cannot tell which of the six exits off
the rotary to take. To make things worse,
the radio commercials are blaring,
but you are focused on the road, and