ification activity, minimizing upfront activities in the face of non-functional requirements, the incompleteness of user stories (especially for secure systems), the need to align sprints and iteration rates with customers' and regulators' ability to accept deliveries, and the (non-)availability of test hardware for embedded systems.
˲ Agile assumptions: customer decision-making power and tempo, availability of plentiful test hardware, and
commercial and contractual models
needed to “procure Agile.”
˲ Opportunities: Adoption of formal languages, automated synthesis, and static verification as part of the deployment pipeline. Generalization of continuous integration into an “Evidence Engine.”
We are deploying these ideas on further projects, and look forward to being able to report the results. We hope others will do the same.
1. Beck, K. Extreme Programming Explained: Embrace
Change. Addison Wesley, 1999.
2. Boehm, B. and Turner, R. Balancing Agility and Discipline:
A Guide for the Perplexed. Addison Wesley, 2003.
3. Chapman, R. and Amey, P. Static verification and
extreme programming. In Proceedings of the ACM
SIGAda Conference (2003).
4. Chapman, R. and Schanda, F. Are we there yet? 20
years of industrial theorem proving with SPARK. In
Proceedings of Interactive Theorem Proving 2014.
Springer LNCS Vol. 8558, (2014), 17–26.
5. Chenu, E. Agility and Lean for Avionics. Thales
Avionics, 2009; http://www.open-do.org/2009/05/07/
6. Hall, A. and Chapman, R. Correctness by construction:
Building a commercial secure system. IEEE Software
19, 1 (2002), 18–25.
7. Jackson, M. Problem Frames. Pearson, 2000.
8. Kanig, J. et al. Explicit assumptions—A prenup for
marrying static and dynamic program verification. In
Proceedings of Tests and Proofs 2014. Springer-Verlag
LNCS, 8570, (2014), 142–157; DOI: 10.1007/978-3-
9. Larman, C. and Basili, V. Iterative and incremental
development: A brief history. IEEE Computer, 2003.
10. Meyer, B. Agile! The Good, the Hype, and the Ugly. Springer, 2014.
11. Middleton, P. and Sutton, J. Lean Software Strategies.
Productivity Press, 2005.
12. Redmill, F. Software Projects: Evolutionary vs. Big-Bang Delivery. Wiley, 1997; http://www.safetycritical.
13. SINTEF. SafeScrum website, 2015; http://www.sintef.
Roderick Chapman is an independent consultant software engineer, and an honorary visiting professor at the University of York, U.K.
Neil White is Director of the Intelligent Systems Expertise Centre of Altran U.K.
Jim Woodcock is Professor of Software Engineering in the Department of Computer Science at the University of York, U.K.
Thanks to Felix Redmill, Jon Davies, Mike Parsons, Harold
Thimbleby, and Communications’ reviewers for their
comments on earlier versions of this Viewpoint.
Copyright held by authors.
adopt, technologies, such as bounded
model checking. Computing power is
readily available to make these analyses tractable at an Agile tempo.
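As an illustration of the idea (ours, not the authors'), bounded model checking exhaustively explores a system's behavior up to a fixed depth k, reporting a counterexample trace if the safety property can be violated within that bound. The toy transition system and property below are invented for the example:

```python
def bounded_check(init, step, safe, k):
    """Explore all states reachable from `init` in at most k steps;
    return a counterexample trace if `safe` is violated, else None."""
    frontier = [(s, (s,)) for s in init]
    seen = set(init)
    for _ in range(k):
        successors = []
        for state, trace in frontier:
            for nxt in step(state):
                if not safe(nxt):
                    return trace + (nxt,)  # property violated within bound
                if nxt not in seen:
                    seen.add(nxt)
                    successors.append((nxt, trace + (nxt,)))
        frontier = successors
    return None  # no violation found within k steps

# Toy system: a counter that may increment or reset to zero.
# Safety property (deliberately falsifiable): counter stays below 5.
trace = bounded_check(
    init=[0],
    step=lambda n: [n + 1, 0],
    safe=lambda n: n < 5,
    k=10,
)
print(trace)  # (0, 1, 2, 3, 4, 5): the property fails at depth 5
```

Industrial tools work symbolically (unrolling the transition relation into a SAT/SMT query) rather than enumerating states, but the bound on exploration depth is the same idea, and it is exactly what makes the analysis predictable enough to run on every commit.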
A second opportunity comes with the realization that, if we can automate analysis, building, and testing of code, why not automate the production of other artifacts, such as synthesis of code from formal models, traceability analysis, and all the other documentation that might be required by a particular customer, regulator, or standard? An example of such an “Evidence Engine” is shown in the accompanying figure.
A crucial issue: How can we adopt an Agile approach, and still estimate, bid, win, and deliver projects at a reasonable profit? Our customers' default approach is often to require a competitive bid at a fixed price, but how can this be possible in an Agile fashion if we are brave enough to admit that we don't know everything at the start of a project? In most of our projects, the users, procurers, and regulators are distinct groups, all of whom may have wildly different views of what “Agile” means.
We have had good experience with
a two-phase approach to contracting,
akin to the “architect/builder” model
for building a house. Phase 1 consists
of the “Upfront” work—requirements,
architectural design, and construction
of a skeletal satisfaction argument.
The “just enough” termination criteria
for phase 1 are:
˲ Convincing evidence that the architecture will work, meet non-functional requirements, and can accommodate foreseeable change.
˲ An estimate of the size (and therefore cost) of the remaining work, given
the currently understood scope.
˲ Established ground rules for agreeing the scope, size, and additional cost of change requests, and commitment to the tempo of decision making for triage of changes and defects.
Phase 2 (possibly a competitive
bid) could be planned as an iterative
development, using the ideas set out
here. The MULTOS CA was delivered
in this fashion, with phase 1 on a time-and-materials basis, and phase 2 on a
capped-price risk-sharing contract.
High-Integrity Deployment Pipeline
We have used the ideas described in this Viewpoint at Altran, but we have yet to deploy all of them at once. An idealized Agile development process would include:
˲ Principled requirements engineering,7 concentrating initially on non-functional requirements and development of architecture, specification, and associated satisfaction arguments.
˲ A rolling formal specification, with just enough formality to estimate the remaining work and plan the opening development iterations.
˲ An evidence engine, combining static verification, continuous regression testing, automated generation of documents and assurance evidence, and a cloud of virtualized target platforms for integration and deployment.
˲ A planned, iterative development style, starting with a partial order over system infrastructure and features that exposes potential for parallel development. Early iterations are planned in detail, while the plans for later iterations are left open to accommodate change.
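The four elements above can be pictured as stages of one automated pipeline that accumulates assurance evidence on every run. The sketch below is ours, not drawn from the article; the stage names and checks are invented placeholders standing in for real tools (a prover such as SPARK, a regression suite, a document generator):

```python
from dataclasses import dataclass, field

@dataclass
class Evidence:
    """Assurance artifacts accumulated from one pipeline run."""
    results: dict = field(default_factory=dict)

    def record(self, stage, ok, detail):
        self.results[stage] = (ok, detail)

    @property
    def all_passed(self):
        return all(ok for ok, _ in self.results.values())

def run_pipeline(stages):
    """Run every (name, check) stage, recording its outcome.

    All stages run even after a failure, so one pass yields a
    complete evidence picture rather than stopping at the first error.
    """
    evidence = Evidence()
    for name, check in stages:
        ok, detail = check()
        evidence.record(name, ok, detail)
    return evidence

# Placeholder checks standing in for real verification tools.
stages = [
    ("static_verification", lambda: (True, "0 unproved checks")),
    ("regression_tests",    lambda: (True, "152/152 passed")),
    ("doc_generation",      lambda: (True, "traceability matrix built")),
]
evidence = run_pipeline(stages)
print(evidence.all_passed)  # True: every stage produced passing evidence
```

The design choice worth noting is that the pipeline's output is the evidence record itself, not just a pass/fail bit: the same run that gates a release also produces the material a regulator or customer would ask for.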
Returning to the questions posed at
the beginning of this Viewpoint, we
could summarize our findings as follows:
˲ No clash: continuous integration,
verification-driven development style,
continuous regression testing, and an
explicitly planned iterative approach.
˲ Potential clash or inappropriate:
overdependence on test as the sole ver-