database history has been largely neglected. For example, the index of Isaacson’s book does not include entries for “database” or for any of the four people to have won Turing Awards in this area: Charles W. Bachman (1973), Edgar F. Codd (1981), James Gray (1998), and Michael Stonebraker (2014).

That’s a shame, because if any technology was essential to the rebuilding of our daily lives around digital infrastructures, which I assume is what Isaacson means by “the Digital Revolution,” then it was the database management system. Databases undergird the modern world of online information systems and corporate intranet applications. Few skills are more essential for application developers than a basic familiarity with SQL, the standard database query language, and a database course is required for most computer science and information systems degree programs. Within ACM, SIGMOD, the Special Interest Group on Management of Data, has a long and active history of fostering database research. Many IT professionals center their entire careers on database technology: the census bureau estimates the U.S. alone employed 120,000 database administrators in 2014 and predicts faster-than-average growth for this role.

Bachman’s IDS was years ahead of its time, implementing capabilities that had until then been talked about but never accomplished. Detailed functional specifications for the system were complete by January 1962, and Bachman was presenting details of the planned system to his team’s in-house customers by May of that year. It is less clear from archival materials when the system first ran, but Bachman tells me that a prototype installation of IDS was tested with real data in the summer of 1963, running twice as fast as a custom-built manufacturing control system performing the same tasks.

The details of IDS, Bachman’s life story, and the context in which it arose have been explored elsewhere.2,6 In this column, I focus on two specific questions:
˲ Why do we view IDS as the first database management system?
˲ What were its similarities to and differences from later systems?

There will always be an element of subjectivity in judgments about “firsts,” particularly as IDS predated the concept of a database management system. As a fusty historian I value nuance and am skeptical of the idea that any important innovation can be fully understood by focusing on a single breakthrough moment. I have documented many ways in which IDS built on earlier file management and report generation systems.7 However, if any system deserves the title of “first database management system,” it is clearly IDS. It became a model for the earliest definitions of “data base management system” and included most of the core capabilities later associated with the concept.

What Was IDS For?
Bachman created IDS as a practical tool, not an academic research project. In 1963 there was no database research community. Computer science was just beginning to emerge as an academic field, but its early stars focused on programming language design, theory of computation, numerical analysis, and operating system design. In contrast to this academic neglect, the efficient and flexible handling of large collections of structured data was the central challenge for what we would now call corporate information systems departments, and was then called business data processing.

During the early 1960s the hype and reality of business computing diverged dramatically. Consultants, visionaries, business school professors, and computer salespeople all agreed that the best way to achieve real economic payback from computerization was to establish a “totally integrated management information system.”8 This would integrate and automate all the core operations of a business, ideally with advanced management reporting and simulation capabilities built right in. The latest and most expensive computers of the era had new capabilities that seemed to open the door to a more aggressive approach. Compared to the machines of the 1950s they had relatively large memories. They featured disk storage as well as tape drives, could process data more rapidly, and some were even used to drive interactive terminals.

The reality of data processing changed much more slowly than the hype, remaining focused on simple administrative applications that batch processed large files to accomplish tasks such as weekly payroll processing, customer statement generation, or accounts payable reporting. Many companies announced their intention to build totally integrated management information systems, but few ever claimed significant success. A modern reader would not be shocked to learn that firms were unable to create systems of comparable scope to today’s Enterprise Resource Planning and data warehouse projects using computers with perhaps the equivalent of 64KB of memory, no real operating system, and a few megabytes of disk storage. Still, even partially integrated systems covering significant portions of a business would have real value. The biggest roadblocks to even modest progress toward this goal were the sharing of data between applications and the difficulties application programmers faced in exploiting random access disk storage.

Getting a complex job done might involve dozens of small programs and the generation of many working tapes full of intermediate data. These banks of whirring tape drives provided computer centers with their main source of visual interest in the movies of the era. Tape-based processing techniques evolved directly from those used with pre-computer mechanical punched card machines: files, records, fields, keys, grouping, merging data from two files, and the hierarchical combination of master and detail records within a single file. These applied to
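The tape-era combination of master and detail records described earlier in this column can be sketched in modern terms as a single sequential pass over two files pre-sorted by key, which is what the tape sort steps guaranteed. This is a hypothetical illustration only, not IDS or period code; the function name, record layouts, and sample data are all invented for the example.

```python
# Hypothetical sketch of a tape-style master/detail merge: both inputs are
# assumed pre-sorted by key, and each record is read exactly once, as in a
# batch tape pass. Record layouts and names are invented for illustration.

def merge_master_detail(masters, details):
    """Attach each detail record to its master record, assuming both
    lists are already sorted by "key" (as tape sort steps would ensure)."""
    result = []
    d = 0
    for master in masters:
        key = master["key"]
        # Skip any stray detail records with no matching master.
        while d < len(details) and details[d]["key"] < key:
            d += 1
        group = {"master": master, "details": []}
        # Collect the run of detail records sharing the current master key.
        while d < len(details) and details[d]["key"] == key:
            group["details"].append(details[d])
            d += 1
        result.append(group)
    return result

# Example: customer masters and order details, both sorted by key.
masters = [{"key": 1, "name": "Acme"}, {"key": 2, "name": "Globex"}]
details = [{"key": 1, "amount": 50}, {"key": 1, "amount": 75},
           {"key": 2, "amount": 20}]
grouped = merge_master_detail(masters, details)
```

Because neither input is ever revisited, the same logic worked when "lists" were reels of tape that could only be read forward, which is precisely why random access disk storage demanded such different programming techniques.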