to be clear that whenever you're doing some slicing, you're cutting away some of the system's potential behavior, which means you may lose some test coverage. That's why this ends up being so challenging. As Nico was saying, however, since the slicing is also closely coupled with your test purposes, you still ought to end up being able to cover all the requirements in your documentation.
KICILLOF: Yes, coupling to test purposes is key because if the slicing were
done just according to your use cases,
only the most common usage patterns
of the system might end up being tested. But that’s not the case here.
Also, throughout the tool chain, we
provide complete traceability between
the statements taken from the specification and the steps noted in a test log.
We have tools that can tell you whether the way you’ve decided to slice the
model leaves out any requirements you
were intending to test. Then at the end
you get a report that tells you whether
your slicing proved to be excessive or not.
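To make the interplay between slicing and requirement coverage concrete, here is a minimal sketch in Python. It is not the actual Spec Explorer toolchain; the model, the requirement IDs, and the slicing predicate are all invented for illustration. The idea it demonstrates is the one described above: a slice removes transitions from the explored model, and a coverage report then flags any requirement whose only exercising transitions were sliced away.

```python
# A hypothetical sketch (not Spec Explorer's actual API) of checking a
# sliced behavioral model against intended requirement coverage.
# States, actions, and requirement IDs below are all invented.

from collections import deque

# Each transition records the requirement IDs it exercises.
MODEL = {
    "init":    [("connect", "session", {"REQ-1"})],
    "session": [("read", "session", {"REQ-2"}),
                ("write", "session", {"REQ-3"}),
                ("close", "init", {"REQ-4"})],
}

def explore(model, start, slice_pred):
    """Breadth-first exploration that keeps only the transitions the
    slicing predicate admits; returns the set of covered requirements."""
    covered, seen, queue = set(), {start}, deque([start])
    while queue:
        state = queue.popleft()
        for action, target, reqs in model.get(state, []):
            if not slice_pred(state, action):
                continue  # this behavior was sliced away
            covered |= reqs
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return covered

ALL_REQS = {"REQ-1", "REQ-2", "REQ-3", "REQ-4"}

# A slice that drops all "write" behavior: the coverage report
# flags REQ-3 as no longer testable under this slice.
covered = explore(MODEL, "init", lambda state, action: action != "write")
missing = ALL_REQS - covered
print("uncovered requirements:", sorted(missing))  # ['REQ-3']
```

A slice that admits every transition would leave `missing` empty, which is the "your slicing was not excessive" outcome the report described above would give.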
By all accounts, the testing project has
been extremely successful in helping
ensure that Microsoft’s protocol documents are of sufficiently high quality to satisfy the company’s regulatory
obligations related to Windows Client
and Windows Server communications.
But the effort hasn’t stopped there,
as much the same approach has been
used to test the protocol documentation for Office, SharePoint Server, SQL
Server, and Exchange Server.
This work, done with the goal of providing for interoperability with Microsoft's high-volume products, was well suited to the model-based testing technology that was productized to support the court-ordered protocol documentation program. Because projects can be scaled by dividing the work into well-defined units with no cross dependencies, the size of a testing project is limited only by the number of available testers. Because of this scalability, projects can also be completed efficiently, which bodes well for the technology's continued use within Microsoft, and beyond. What's more, Microsoft's protocol documentation testing effort appears to have had a profound effect on the company's overall worldview and engineering culture.
BINDER: Within Microsoft, do you see a broader role for the sort of work you're doing? Or does it pretty much just begin and end with compliance to the court decree?
KICILLOF: It goes beyond the decree.
Increasing the interoperability of our
products is a worthy goal in and of itself.
We’re obviously in a world of heterogeneous technology where customers expect products to interoperate.
That’s also changing the way products are developed. In fact, one of our
goals is to improve the way protocols are
created inside Microsoft. That involves
the way we design protocols, the way
we document protocols such that third
parties can use them to talk to our products, and the way we check to make sure
our documentation is correct.
GRIESKAMP: One aspect of that has to
do with the recognition that a more systematic approach to protocol development is needed. For one thing, we currently spend a lot of money on quality
assurance, and the fact that we used to
create documentation for products after they had already been shipped has
much to do with that. So, right there we had an opportunity to save a lot of money.
Specification- or model-driven development is one possible approach for
optimizing all of this, and we’re already
looking into that. The idea is that from
each artifact of the development process you can derive documentation,
code stubs, and testable specifications
that are correct by definition. That way,
we won’t end up with all these different independently created artifacts that
then have to be pieced together after the
fact for testing purposes.
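The "correct by definition" idea above can be sketched with a toy example. The following Python fragment is purely illustrative, assuming a made-up message layout rather than any real Microsoft protocol: a single declarative specification is the one artifact from which both human-readable documentation and a conformance check (a test oracle) are derived, so the two can never drift apart.

```python
# A sketch of deriving multiple artifacts from one specification.
# The message layout below is hypothetical, invented for illustration.

SPEC = [
    # (field name, expected type, description)
    ("version", int, "Protocol version; must be 1"),
    ("payload", str, "UTF-8 command payload"),
]

def render_doc(spec):
    """Derive documentation text from the specification."""
    return "\n".join(f"{name} ({typ.__name__}): {desc}"
                     for name, typ, desc in spec)

def conforms(message, spec):
    """Derive a test oracle from the same specification: every
    declared field must be present and of the declared type."""
    return all(isinstance(message.get(name), typ)
               for name, typ, _ in spec)

print(render_doc(SPEC))
print(conforms({"version": 1, "payload": "open"}, SPEC))  # True
print(conforms({"version": "1"}, SPEC))                   # False
```

Because both `render_doc` and `conforms` read the same `SPEC`, updating the specification automatically updates the documentation and the test oracle together, which is the point of the single-artifact approach described above.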
For model-based testing in particular, I think this project serves as a powerful proof point of the efficiencies and
economies that can be realized using
this technology. That’s because this is
by far the largest undertaking in an industrial setting where, within the same
project, both traditional testing methodologies and model-based testing
have been used. This has created a rare
opportunity to draw some side-by-side
comparisons of the two.
We have been carefully measuring various metrics throughout, so we can now show empirically how we managed essentially to double our efficiency by using model-based testing. The ability to actually document that is a really big deal.
© 2011 ACM 0001-0782/11/07 $10.00