Science | DOI: 10.1145/1897816.1897822
Neil Savage
Information Theory
After Shannon
Purdue University’s Science of Information Center seeks
new principles to answer the question ‘What is information?’
In a sense, Claude Shannon invented the Internet. An electronic engineer at Bell Labs, Shannon developed information theory in "A Mathematical Theory of Communication," a landmark paper published in 1948. By quantifying the limits to which data could be compressed, stored, and transmitted, he paved the way for high-speed communications, file compression, and data transfer, the basis for the Internet, the CD and the DVD, and all that they entail. Now scientists at Purdue University, with a $25 million, five-year grant from the U.S. National Science Foundation, have created the Science of Information Center with the goal of moving beyond Shannon. They aim to develop principles that encompass such concepts as structure, time, space, and semantics. These principles might help design better mobile networks, lead to new insights in biology and neuroscience, drive research in quantum computing, and even aid our understanding of social networks and economic behavior.
"Whether we will build a new theory or not remains to be seen," says Wojciech Szpankowski, professor of computer science at Purdue University and leader of the project, which includes some 40 researchers at nine universities. "It's definitely time, after 60 years, to revisit information theory. It's basically communication and storage today, but we need to go beyond that."
In Shannon's theory, information, which consists of bits, is that which reduces a recipient's statistical uncertainty about what a source transmitted over a communications channel. It allows engineers to define the capacity of both lossless and lossy channels and state the limits to which data can be compressed. Shannon theory ignores the meaning of a message, focusing only on whether the 1s and 0s of binary code are being transmitted accurately. It doesn't care about the physical nature of the channel; information theory is the same for a telegraph wire or a fiber optic cable. And it assumes infinite delay, so a receiver has all the time it needs to receive and recognize a signal.
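Shannon's measure can be made concrete in a few lines of code. The Python sketch below computes the entropy of a source, the lower bound on how many bits per symbol any lossless compressor can achieve; the two distributions are hypothetical, chosen only for illustration.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits per symbol: the minimum average
    number of bits needed to losslessly encode symbols drawn
    from this distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical four-symbol sources. A fixed-length code spends
# 2 bits per symbol on either; the skewed source can be
# compressed below that, down to its entropy.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.70, 0.15, 0.10, 0.05]

print(shannon_entropy(uniform))  # 2.0 bits per symbol
print(shannon_entropy(skewed))   # about 1.32 bits per symbol
```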
A growing challenge for information theory is the field of quantum information and quantum computing.
Szpankowski argues that information goes beyond those constraints. Another way to define information is as that which increases understanding, and it can be measured by whether it helps a recipient accomplish a goal. At that point, semantic, temporal, and spatial factors come into play. If a person waiting for a train receives a message at 2 p.m. saying the train leaves at 1 p.m., the message contains essentially no information. In mobile networks, the value of information can change over the time it takes to transmit it because the person receiving it has moved; instructions to make a left turn are pointless, for example, if the driver has already passed the intersection. And there's no good way to measure how information evolves on the Web. "We cannot even understand how much information is transmitted on the Internet because we don't understand the temporal aspect of information," Szpankowski says.
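A toy model makes the temporal point concrete. In the sketch below, the decay rule and numbers are hypothetical, not drawn from the center's work; it simply treats a message's value as expiring at a deadline, something Shannon's delay-free framework has no way to express.

```python
def message_value(base_value, delay, deadline):
    """Toy model: a message keeps its full value if it arrives
    by its deadline and is worthless afterward. Its Shannon bit
    content is identical in both cases."""
    return base_value if delay <= deadline else 0.0

# "Turn left at the next intersection," sent to a moving driver:
# full value if it arrives in time, none once the turn is passed.
print(message_value(1.0, delay=10, deadline=30))  # 1.0
print(message_value(1.0, delay=45, deadline=30))  # 0.0
```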
[Photo caption: Wojciech Szpankowski, project leader for the Science of Information Center at Purdue University, is revisiting Claude Shannon's information theory with a team of some 40 researchers. Photograph by Vincent P. Walter.]