ity to deliver dependable and usable
software. Fritz Bauer, one of the field’s
founders, believed a rigorous engineering approach was needed. He famously
quipped, “Software engineering is the
part of computer science that is too hard
for computer scientists.” Over the years,
software engineers produced many
powerful tools: languages, module managers, version trackers, visualizers, and
debuggers are some examples. In his
famous “No Silver Bullet” assessment
(1986), Fred Brooks concluded that the
software crisis had not abated despite
huge advancements in tools and methods; the real problem was getting an
intellectual grasp of the problem and
translating that understanding into an
appropriate system architecture.2 The
tools of 1986, while better than those
of 1968, relied on concepts that did not
scale up to ever-larger systems. The situation today is much the same: tools are
more powerful, but we struggle with
scalability, usability, and predictability.
Current software engineering is
based on four key assumptions:
• Dependable large systems can only be attained through rigorous application of the engineering design process (requirements, specifications, prototypes, testing, acceptance).
• The key design objective is an architecture that meets specifications derived from knowable and collectable requirements.
• Individuals of sufficient talent and experience can achieve an intellectual grasp of the system.
• The implementation can be completed before the environment changes very much.
What if these assumptions no longer
hold? The first assumption is challenged
by the failures of large systems that used
the traditional design process and the
successes of other large systems that
simply evolved. The remaining assumptions are challenged by the increasingly
dynamic environments, often called
ecosystems, in which large systems operate. There is no complete statement
of requirements because no one person,
or even small group, can have complete
knowledge of the whole system or can
fully anticipate how the community’s requirements will evolve.
System Evolution: A New Common Sense
To avoid obsolescence, therefore, a system should undergo continual adaptation to the environment. There are two
main alternatives for creating such adaptations. The first, successive releases
of a system, is the familiar process of
software product releases. It can work
in a dynamic environment only when
the release cycle is very short, a difficult
objective under a carefully prescribed
and tightly managed process. Windows
Vista, advertised as an incremental improvement over XP, was delivered years
late and with many bugs.
The second approach to adaptation mimics natural evolution: many systems compete, the fitter live on, and the less fit die out. Linux,
the Internet, and the World Wide Web
illustrate this with a constant churn of
experimental modules and subsystems,
the best of which are widely adopted.
Evolutionary system design can become a new common sense that could
enable us to build large critical systems
successfully. Evolutionary approaches
deliver value incrementally. They continually refine earlier successes to deliver
more value. The chain of increasing value sustains successful systems through
multiple short generations.
Designs by Bureaucratic Organizations
Fred Brooks observed that software
tends to resemble the organization that
built it. Bureaucratic organizations tend
toward detailed processes constrained
by many rules. The U.S. government’s
standard acquisition practices, based on
careful preplanning and risk avoidance,
fit this paradigm. Their elaborate architectures and lengthy implementation
cycles cannot keep up with real, dynamic
environments.
It may come as a surprise, therefore, that practices for adaptability are
allowed under government acquisition rules. In 2004, the Office of the Secretary of Defense sponsored the launch of W2COG, the World Wide Consortium for the Grid (w2cog.org), to help advance networking technology for defense using open-development processes such as those of the World Wide Web Consortium (w3c.org). The W2COG
took advantage of a provision of acquisition regulations that allows Limited Technology Experiments (LTEs).
The W2COG recently completed
an experiment to develop a secure
service-oriented architecture system,
comparing an LTE using evolutionary
methods against a standard acquisition process. Both received the same
government-furnished software for
an initial baseline. Eighteen months
later, the LTE’s process delivered a
prototype open architecture that addressed 80% of the government requirements, at a cost of $100K, with
all embedded software current, and
a plan to transition to full COTS software within six months.
In contrast, after 18 months, the
standard process delivered only a concept document that did not provide a
functional architecture, had no working
prototype, deployment plan, or timeline, and cost $1.5M. The agile method
produced a “good enough,” immediately usable 80% success for 1/15 the cost of the standard method, which seemed to have embarked on the typically long road to disappointment.
Agile Methods for Large Systems
Agile system development methods
have been emerging for a decade.1,3,6 These methods replace the drawn-out
preplanning of detailed specifications
with a fast, cyclic process of prototyping
and customer interaction. The evolutionary design approach advocated here
is a type of agile process.
The U.S. Government Accountability Office (GAO) has scolded the government on several occasions for its uncommitted lip service to agile processes.4 The GAO believes agile processes could significantly shorten time to delivery, reduce failure rate, and lower costs. Many
people resist the GAO advice because