tation, it was the introduction of the
Burroughs B5000 in 1961 that captured
the idea that ultimately proved to be
the way forward: disjoint CPUs concurrently executing different instruction
streams, but sharing a common memory. In this regard (as in many) the B5000
was at least a decade ahead of its time.
But it was not until the 1980s that the
need for multiprocessing became clear
to a wider body of researchers, who over
the course of the decade explored cache
coherence protocols (for example, the
Xerox Dragon and DEC Firefly), prototyped parallel operating systems (for
example, multiprocessor Unix running
on the AT&T 3B20A), and developed parallel databases (for example, Gamma at
the University of Wisconsin).
In the 1990s, the seeds planted by researchers in the 1980s bore the fruit of
practical, shipping systems, with many
computer companies (for example, Sun,
SGI, Sequent, Pyramid) placing big bets
on symmetric multiprocessing. These
bets on concurrent hardware necessitated corresponding bets on concurrent
software: if an operating system cannot
execute in parallel, not much else in the
system can either. These companies
came to the realization (independently)
that their operating systems must be rewritten around the notion of concurrent
execution. These rewrites took place in
the early 1990s and the resulting systems were polished over the decade. In
fact, much of the resulting technology
can today be seen in open source operating systems like OpenSolaris, FreeBSD, and Linux.
Just as several computer companies
made big bets around multiprocessing,
several database vendors made bets
around highly parallel relational databases; upstarts like Oracle, Teradata,
Tandem, Sybase and Informix needed
to use concurrency to achieve a performance advantage over the mainframes
that had dominated transaction processing until that time.5 As in operating
systems, this work was conceived in the