verify was also a difficult problem. The
requirements for the EB are actually
specified for the combination of the
EB and the aircraft model, but checking both the EB and the aircraft model
exceeded the capabilities of the Prover
Plug-In model checker. After consultation with the OFP designers, the verification team decided to verify whether
the six actuator commands would always be within dynamically computed
upper and lower limits. Violation of
these properties would indicate a design error in the EB logic.
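The shape of the verified property can be sketched in a few lines. The following is a hypothetical toy, not the actual OFP logic: the names `compute_limits`, `eb_step`, and the two-mode limit schedule are illustrative stand-ins for the dynamically computed bounds described above.

```python
def compute_limits(mode):
    """Dynamically computed bounds; here a toy two-mode schedule (illustrative)."""
    return (-10.0, 10.0) if mode == "normal" else (-2.0, 2.0)

def eb_step(mode, demand):
    """Toy effector blender: clamp the demanded command to the mode's limits."""
    lo, hi = compute_limits(mode)
    return max(lo, min(hi, demand))

def property_holds(mode, demand):
    """The kind of property checked: each actuator command stays within its limits."""
    lo, hi = compute_limits(mode)
    cmd = eb_step(mode, demand)
    return lo <= cmd <= hi

# Exhaustive check over a small discretized input space; a model checker
# would instead explore the reachable state space symbolically.
assert all(property_holds(m, d)
           for m in ("normal", "degraded")
           for d in [x / 2.0 for x in range(-40, 41)])
```

A violation of such a property on any reachable state would point to a design error in the blending logic.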
Even with these adjustments, the
EB model was large enough that it had
to be decomposed into a hierarchy of
components several levels deep. The
leaf nodes of this hierarchy were then
verified using Prover Plug-In, and their composition was verified using manual proofs. This approach
also ensured that unsoundness could
not be introduced through circular
reasoning since Simulink enforces
the absence of cyclic dependencies
between atomic subsystems.
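The compositional scheme above can be sketched as follows: leaf components are checked directly, and each composite is discharged assuming its children's guarantees. Because the dependency graph is acyclic, a topological order gives a sound proof order with no circular reasoning. All subsystem names and the two `verify_*` stand-ins here are hypothetical.

```python
from graphlib import TopologicalSorter

# subsystem -> the subsystems it depends on (acyclic, as Simulink enforces
# for atomic subsystems)
deps = {"EB": ["Blender", "Limiter"], "Blender": [], "Limiter": []}

def verify_leaf(name):
    # Stand-in for a Prover Plug-In run on a leaf subsystem.
    return True

def verify_composition(name, child_results):
    # Stand-in for the manual proof that the children's guarantees
    # imply the composite's guarantee.
    return all(child_results)

results = {}
# static_order() yields dependencies before dependents, so every child's
# result exists before its parent is checked.
for node in TopologicalSorter(deps).static_order():
    if deps[node]:
        results[node] = verify_composition(node, [results[c] for c in deps[node]])
    else:
        results[node] = verify_leaf(node)

assert results["EB"]
```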
Ultimately, five errors in the EB design logic were discovered and corrected through model checking of these
properties. In addition, several potential errors that were being masked by
defensive design practices were found
and corrected.
Lessons from the Case Studies
The case studies described here demonstrate that model checking can be
effectively used to find errors early in
the development process for many
classes of models. In particular, even
very complex models can be verified
with BDD-based model checkers if
they consist primarily of Boolean and
enumerated types. Every industrial
system we have studied contains large
sections that either meet this constraint or can be made to meet it with
some alteration.
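A minimal sketch shows why such models are tractable: with only Boolean and enumerated types, the reachable state space is finite and small. BDD-based checkers such as NuSMV explore it symbolically; this toy version, with a hypothetical mode/arming model, simply enumerates states explicitly.

```python
def step(state, inp):
    """Toy transition function over an enumerated mode and a Boolean flag."""
    mode, armed = state
    if inp == "arm":
        return (mode, True)
    if inp == "engage" and armed:
        return ("engaged", armed)
    return state

init = ("standby", False)
inputs = ("arm", "engage", "noop")

# Reachability: collect every state reachable from the initial state.
reachable, frontier = {init}, [init]
while frontier:
    s = frontier.pop()
    for i in inputs:
        t = step(s, i)
        if t not in reachable:
            reachable.add(t)
            frontier.append(t)

# Invariant: the system never engages without being armed.
assert all(not (mode == "engaged" and not armed) for mode, armed in reachable)
```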
For this class of models, the tools
are simple enough for developers to use
them routinely and without extensive
training. In our experience, a single day
of training and a low level of ongoing
mentoring are usually sufficient. This
also makes it practical to perform model checking early in the development
process while a model is still changing.
Running a set of properties after
each model revision is a quick and
easy way to see if anything has been
broken. We encourage our developers
to “check your models early and check
them often.” The time spent model
checking is recovered several times
over by avoiding rework during unit
and integration testing.
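The "check early, check often" practice amounts to keeping the property suite alongside the model and rerunning it after every revision, like a regression test suite. The model and properties below are illustrative stand-ins.

```python
def model_step(state, inp):
    """Stand-in for the revised model under check: a saturating accumulator."""
    return min(state + inp, 100)

# The property suite lives in one place and is rerun after each revision.
PROPERTIES = {
    "output_bounded": lambda s, i: 0 <= model_step(s, i) <= 100,
    "monotone_in_input": lambda s, i: model_step(s, i) >= model_step(s, 0),
}

def regression_check():
    """Return the names of any properties broken by the latest revision."""
    return [name for name, prop in PROPERTIES.items()
            if not all(prop(s, i) for s in range(101) for i in range(10))]

assert regression_check() == []
```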
Future Directions
There are many directions for further
research. As illustrated in the CerTA
FCS Phase II study, numerically intensive models still pose a challenge for
model checking. SMT-based model
checkers hold promise for verification
of these systems, but the need to write
properties that can be verified through
induction over the state transition relation make them more difficult for developers to use.
Most industrial models used to
generate code make extensive use of
floating point numbers. Other models,
particularly those that deal with spatial
relationships such as navigation, make
extensive use of trigonometric and other
transcendental functions. A sound and
efficient way of checking systems using
floating point arithmetic and transcendental functions would be very helpful.
It can also be difficult to determine
how many properties must be checked.
Our experience has been that checking
even a few properties will find errors,
but that checking more properties
will find more errors. Unlike testing, for which many objective coverage criteria have been developed, completeness criteria for properties do not seem
to exist. Techniques for developing
or measuring the adequacy of a set of
properties are needed.
As discussed in the CerTA FCS Phase
II case study, the verification of very large
models may be achieved by using model
checking on subsystems and more traditional reasoning to compose the subsystems. Combining model checking
and theorem proving in this way could
be a very effective approach to the compositional verification of large systems.
Acknowledgments
The authors wish to thank Ricky Butler,
Celeste Bellcastro, and Kelly Hayhurst
of the NASA Langley Research Center,
Mats Heimdahl, Yunja Choi, Anjali
Joshi, Ajitha Rajan, Sanjai Rayadurgam, and Jeff Thompson of the University of Minnesota, Eric Danielson, John
Innis, Ray Kamin, David Lempia, Alan
Tribble, Lucas Wagner, and Matt Wilding of Rockwell Collins, Bruce Krogh of
CMU, Vince Crum, Wendy Chou, Ray
Bortner, and David Homan of the Air
Force Research Lab, and Greg Tallant
and Walter Storm of Lockheed Martin
for their contributions and support.
This work was supported in part
by the NASA Langley Research Center
under contract NCC-01001 of the Aviation Safety Program (AvSP) and by the
Air Force Research Lab under contract
FA8650-05-C-3564 of the Certification Technologies for Advanced Flight
Control Systems program (CerTA FCS)
[88ABW-2009-2730].
Steven P. Miller (spmiller@rockwellcollins.com) is a principal software engineer in the Advanced Technology Center at Rockwell Collins, Cedar Rapids, IA.
Michael W. Whalen (whalen@cs.umn.edu) is the program director at the University of Minnesota Software Engineering Center, Minneapolis, MN.
Darren D. Cofer (ddcofer@rockwellcollins.com) is a principal systems engineer in the Advanced Technology Center at Rockwell Collins, Cedar Rapids, IA.