experts from the Lavi avionics team, notably Akiva Kaspi and Yigal Livne.
An avionics system is a wonderful example of what my colleague at Weizmann, Amir Pnueli, and I later identified as a reactive system. The
main behavior that dominates such
a system is its reactivity, that is, its
event-driven, control-driven, event-response nature. The behavior is often highly parallel and includes strict
time constraints and possibly stochastic and continuous behavior. A typical
reactive system is not predominantly
data-intensive or algorithmic in nature. Behavior is the crucial problem
in its development—the need to provide a clear yet precise description of
what the system does or should do
over time in response to both external
and internal events.
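
To make the contrast with data-intensive, algorithmic software concrete, here is a minimal, hypothetical sketch of my own (in Python; the component names and events are invented for illustration and not taken from the Lavi project) of what event-driven, control-driven behavior looks like: almost no computation, but the response to each event depends on the current configuration, and independent components evolve side by side, so the overall configuration is effectively a vector of states.

    from dataclasses import dataclass

    # Two orthogonal (independent) components; the set of possible
    # configurations is the Cartesian product of their state sets.
    @dataclass
    class Configuration:
        radar: str = "standby"      # standby | tracking
        display: str = "nav_page"   # nav_page | weapons_page

    def react(cfg: Configuration, event: str) -> Configuration:
        # The response to the same event depends on the current configuration.
        if event == "radar_button":
            cfg.radar = "tracking" if cfg.radar == "standby" else "standby"
        elif event == "page_button":
            cfg.display = "weapons_page" if cfg.display == "nav_page" else "nav_page"
        return cfg

    # A reactive system is essentially this loop, run forever: wait for an
    # event, respond according to the current configuration, wait again.
    cfg = Configuration()
    for event in ["radar_button", "page_button", "radar_button"]:
        cfg = react(cfg, event)
        print(event, "->", cfg)

The interesting questions about such a system are not about the functions it computes but about which response occurs in which configuration, which is exactly where the trouble described below began.
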
The Lavi avionics team consisted of
extremely talented people, including
those involved in radar, flight control,
electronic warfare, hardware, communication, and software. The radar
people could provide the precise algorithm used to measure the distance to
a target. The flight-control people could
talk about synchronizing the controls in
the cockpit with the flaps on the wings.
The communications people could talk
about formatting information traveling
through the MuxBus communication
line. Each had his own idiosyncratic
ways of thinking about and explaining
the system, as well as his own diagrams
and emphases.
Figure 1: Page from my early IAI notes (1983). Statechart constructs include hyper-edges, nested orthogonality (a kind of concurrency), and transitions that reset a collection of states (chart on right). Note the use of Cartesian products of sets of states (set-theoretic formulation at the top) to capture the meaning of the orthogonality and the straightforward algebraic notation for transitions between state vectors (lower right).

I would ask seemingly simple questions, such as: “What happens when this button is pressed?” In response, a
weighty two-volume document would
be brought out and volume A would be
opened to page 389, clause 6.11.6.10,
which says that if you press that button
such and such a thing would occur. At
which point (having by then learned
some of the system’s buzz words) I would
say: “Yes, but is that true even when an
infrared missile is locked on a ground
target?” To which someone might say,
“Oh no, in volume B, page 895, clause
19.12.3.7, it says that in such a case
this other thing happens.” These Q&A
sessions would continue, and by the fifth or sixth question
the engineers were no longer sure of
the answer and would have to call the
customer (the Air Force people) for a response. By the time we got to the eighth
or ninth question even the customer
didn’t have an answer.
Obviously, someone would eventually have to decide what happens when you
press a certain button under a certain
set of circumstances. However, this person might turn out to be a low-level programmer assigned to write some remote
part of the code, inadvertently making
decisions that influenced crucial behavior on a much higher level. Coming, as
I did, from a clean-slate background in
terms of avionics (a polite way of saying
I knew nothing about the subject), I found this shocking. It seemed extraordinary
that such a talented and professional
team knew in detail the algorithm used
to measure the distance to a target but
not many of the far more basic behavioral facts involving the system’s response
to a simple event.
To illustrate, consider the following
three occurrences of a tiny piece of behavior buried in three totally different
locations in a large specification of a
chemical manufacturing plant:
“If the system sends a signal hot then
send a message to the operator”;
“If the system sends a signal hot with
T > 60° then send a message to the operator”; and
“When the temperature is maximum,
the system should display a message on
the screen, unless no operator is on the
site except when T < 60°.”
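
To see why such scattered prose causes trouble, here is a hypothetical rendering of my own (the predicates, threshold, and interpretation are guesses, not anything taken from an actual specification) of the first two items as explicit conditions; the overlap between them is immediate, and the third resists any such rendering.

    # Clause 1: "If the system sends a signal hot then send a message
    # to the operator."
    def clause_1_fires(hot_signal: bool) -> bool:
        return hot_signal

    # Clause 2: "If the system sends a signal hot with T > 60 degrees
    # then send a message to the operator."
    def clause_2_fires(hot_signal: bool, temperature: float) -> bool:
        return hot_signal and temperature > 60

    # Whenever clause 2 fires, clause 1 fires as well, so either clause 2 is
    # redundant or "hot" was meant to be something weaker than T > 60; the
    # text never says which. Clause 3 (maximum temperature, operator
    # presence, "except when T < 60") admits no single obvious reading and
    # is left unformalized here.
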
Despite my formal education in
mathematical logic, I’ve never been
able to understand the third item. Sarcasm aside, the real problem is that all
three were obviously written by three