The notion of a wave is optimized for
resilience through massive scale, not for local efficiency.
It turns out that the idea of a “flow” in the Internet is
not very different from a virtual circuit; indeed, with
so many stateful middle boxes (such as network
address translation, firewalls, proxies, and multiprotocol-label-switching switch/routers), one can now
say that the whole debate was futile.
However, the future needs of networks will be different from the simple circuit or packet system that
has dominated for the past century, as society shifts
from its historical patterns of work and entertainment. The vast majority of network content is
pull/interest-based; economics [1] argues for making
the “long tail” of content available at low transaction/network costs to large numbers of small groups
of only the most interested recipients.
The decrease in the need for synchronization in
remote activities for video, audio, and static content
argues that networks, including the Internet, be optimized for nonconcurrent use. On the other hand,
people want low latency, which argues for nearby or
local copies of all content. Thus, we might talk about
“asynchronization of multicast” and commercialization of peer-to-peer communication and content
sharing. Rich-value content creators would love to
eliminate any intermediaries while also pushing storage and its costs to edge users.
Technology push also plays a role in Internet-based
communications. Software has changed since the days
of layered system design; today, we sustain reliable
software built from well-contained components
assembled with wrappers designed to enforce behavior. How is this relevant to a future Internet architecture? For one thing, that architecture could be more
diversified, with less commonality in the nodes (hosts
and routers) than we have had for the past 20 years of
PCs, servers, and routers all coming from the same
technology stable.
This also fits my wave-particle model of how technology is changing within networks and protocols.
Recent papers, including [2], have proposed replacing
the layered protocol stack with a graph or even a heap
(pile) of soft protocol components. However, we can
also look at the network topology itself and see that
the role of nodes is changing within it. Perhaps all
nodes are the equivalent of middle boxes, revisiting
the old Internet idea that any component with more
than one network interface can be a router. We see it
in our end-user devices—in my case the Macintosh
iBook I typed this essay on and the Windows smart
phone in my pocket, each with at least three built-in
radios and a wired network interface. When we interconnect these devices, the network communication
topology is far more dynamic than any public network has been in the past.
Many of the increasingly heterogeneous “links”
connecting devices are also not well characterized as
“pipes”; indeed, the capacity of a volume of space containing a number of mobile devices is simply not
known; some physical bounds are known, but the
equivalent of a Shannon Limit in information-theory
terms is not. This lack of information argues that network architects need a temporal graph model of the
network. It also argues that today’s architectures
cannot accommodate a resource model expressed as a temporal graph. Simply labeling the edges of the graph
with static weights to represent capacity or delay does not
capture the right information.
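To make the point concrete, here is a minimal sketch of what a temporal graph edge might look like, in contrast to a single static weight. All names here (`TemporalEdge`, `add_sample`, `at`) are my own illustrative choices, not part of any standard library or proposed architecture: the idea is simply that an edge carries a time-indexed history of capacity and delay, and may not exist at all at some instants.

```python
from dataclasses import dataclass, field

@dataclass
class TemporalEdge:
    """An edge whose capacity and delay vary over time,
    recorded as (time, capacity_bps, delay_s) samples.
    A static weight would collapse all of this to one number."""
    samples: list = field(default_factory=list)

    def add_sample(self, t, capacity, delay):
        self.samples.append((t, capacity, delay))

    def at(self, t):
        """Return the most recent (capacity, delay) known at time t,
        or None if the edge has not yet appeared."""
        known = [s for s in self.samples if s[0] <= t]
        if not known:
            return None
        _, cap, delay = max(known, key=lambda s: s[0])
        return cap, delay

# A temporal graph as an adjacency map: node -> {neighbor: TemporalEdge}
graph = {"A": {"B": TemporalEdge()}}
graph["A"]["B"].add_sample(0.0, 1e6, 0.05)  # link appears at t=0 with 1 Mbps
graph["A"]["B"].add_sample(5.0, 0.0, 0.0)   # link vanishes at t=5
```

Queries such as `graph["A"]["B"].at(1.0)` then answer “what did this link look like at time t,” which a scalar edge weight cannot express.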
One more critical piece of technology—network
coding—further muddies the effort to devise a grand
unified network architecture that would maintain the
wave-particle duality analogy. Practical network coding in wireless and wired networks promises to
remove much of the complexity in resource management. Network coding involves the merging of packets (such as by XOR in the simplest form) transmitted
along common subpaths. Network coding can be
combined with redundancy for reliability.
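The simplest XOR form mentioned above can be sketched in a few lines. This is only an illustration of the coding step, not any particular deployed protocol; the relay and packet names are hypothetical. A relay that has overheard one packet from each of two endpoints broadcasts a single XOR-merged packet, and each endpoint recovers the other’s packet by XORing out its own copy.

```python
def xor_encode(pkt_a: bytes, pkt_b: bytes) -> bytes:
    """Merge two equal-length packets into one coded packet by XOR.
    XOR is its own inverse, so either input can be recovered
    given the coded packet and the other input."""
    return bytes(x ^ y for x, y in zip(pkt_a, pkt_b))

# Two endpoints each send a packet via a common relay.
pkt_from_a = b"hello"
pkt_from_b = b"world"

# The relay transmits one coded packet instead of two plain ones.
coded = xor_encode(pkt_from_a, pkt_from_b)

# Each endpoint XORs out its own packet to recover the other's.
recovered_at_a = xor_encode(coded, pkt_from_a)  # yields B's packet
recovered_at_b = xor_encode(coded, pkt_from_b)  # yields A's packet
```

The saving is that the relay transmits once on the shared subpath where it would otherwise transmit twice; this is the resource-management simplification the text refers to.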
So how might the wave-particle duality idea be
applied to a network? The Internet is already dominated by swarms of network-coded content, no
longer flowing from place to place but emanating like
ripples on a pond from multiple points, then becoming available for local consumption. Neal Stephenson
predicted this with remarkable prescience in his novel
The Diamond Age [3]. Publication of new content is
the start of a new wave. The content spreads through