they would have quietly put up with the
limitations of off-the-shelf software!
Notwithstanding the unique capabilities of ATA that differentiated it from
competitor product offerings, users
were not averse to comparing ATA with
these other tools in terms of feature
completeness. Addressing this partly
involved managing user expectations
of a tool that was, in essence, a research
prototype rather than a product.
One of the recurring issues in client
acceptance was that of tool usability.
ATA was, by design, a tool for non- or
semi-expert users. As such, we had to
pay significant attention to making the
tool behave well under all sorts of usage, sometimes even comically inept
use. Failure to anticipate such tool
abuse resulted in escalations, and
with them the risk of creating a bad image
for the technology.
The final lesson, then, is that the “last
mile” is a deeply flawed metaphor when
used in the context of tech transfer of
software tools. In reality, a research
prototype is just the first mile; everything after that is the work needed to
make the technology work in real-world
scenarios, usable by the target audience, and perhaps most importantly,
to establish a positive value proposition for it. This requires patience and a
long-term commitment on the part of
researchers who wish to carry out a successful tech transfer.
1. Quality Center Enterprise, HP; http://www8.hp.com/
2. Thummalapenta, S., Devaki, P., Sinha, S., Chandra, S.,
Gnanasundaram, S., Nagaraj, D., and Sathishkumar,
S. Efficient and change-resilient test automation:
An industry case study. In Proceedings of the
International Conference on Software Engineering
(Software Engineering in Practice), 2013.
3. Thummalapenta, S., Sinha, S., Singhania, N., and
Chandra, S. Automating test automation. In
Proceedings of the International Conference on
Software Engineering, 2012.
Satish Chandra (firstname.lastname@example.org) is Senior Principal
Engineer at Samsung Research America, Mountain View, CA.
Suresh Thummalapenta (suthumma@microsoft.com) is a member
of the Tools for Software Engineering department at
Microsoft Corporation, Redmond, WA.
Saurabh Sinha (email@example.com) is a member of the
Programming Technologies department at the IBM T. J.
Watson Research Center, Yorktown Heights, NY.
This Viewpoint is based on an invited talk Satish Chandra
presented at the 22nd ACM SIGSOFT International
Symposium on the Foundations of Software Engineering
(FSE 2014). The work described here was carried out at
IBM Research in Bangalore, India.
Copyright held by authors.
The obvious but crucial third lesson is that your users, particularly the early
adopters, are precious and should be
treated as such, because their referral is
crucial in opening more doors.
Over time, we realized trying to
change the ways of existing projects
might be a fruitless initiative. It might
be better to approach the sales side
of the business, and get ATA worked
into the deal right from the start. This
turned out to be a good idea. Salespeople liked showing off the unique
test-automation technology that IBM
Research had to offer. For our part, we
enjoyed getting to talk to higher-level
decision makers on the client side—
these would often be executive-level
people in CIO teams of major corporations—as opposed to just delivery
managers on the IBM side. Once salespeople promised the use of ATA to the
clients, the delivery managers had no
choice but to comply. The result: better traction!
The fourth lesson, then, is that tech
transfer is subject to organizational
dynamics, and sometimes a top-down
approach might be more appropriate
than a bottom-up push.
Getting client teams interested
turned out to be only part of the battle.
There was significant work involved in
customizing ATA to suit a client’s needs.
Since the automation tool is only one
part of the overall workflow, we needed
to ensure that ATA interoperated with any
third-party quality-management infrastructure (such as Quality Center1)
the client used in their organization. We
also found ourselves under a lot of pressure because people could be vocal
about their wish list for ATA, where
Summary of lessons for tech transfer.
Relevance: Researchers should talk to their colleagues on the development side
frequently to identify opportunities where a research insight can address a current problem.
Cost-benefit trade-offs: Researchers should be prepared to propose and implement
measurements on both costs and benefits that could be collected in an initial trial period.
Supporting early users: Early adopters deserve extra consideration for the risk
they take, and for the proof points they will generate.
Organizational dynamics: Decisions on technology adoption may not be based
purely on users’ perception of the technology in question.
“Last mile”: Researchers who desire their work to be adopted in production
should be prepared to walk the long road from a research prototype to a production-ready tool.
project would fall behind schedule.
Delivery managers are foremost responsible for predictable and consistent
delivery, so they were understandably
circumspect about adopting ATA in
their teams. Moreover, their clients were
satisfied with the existing level of productivity, and there was no incentive to
change the status quo.
This brings us to the second important lesson: although we had established the technical feasibility of using
ATA in real projects, we had not made
a case that the benefits outweighed the
costs and the risks involved. Just because a research tool is available for
free does not mean people will adopt
it. People wanted a prior deployment
they could point to when defending
a decision to adopt ATA, and we had
none to show. Moreover, we had no
data indicating the actual productivity
improvements when using ATA.
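The cost-benefit case this lesson calls for can be framed as a simple break-even estimate: one-time adoption cost against per-test savings. The sketch below uses entirely hypothetical numbers and a made-up helper; it illustrates the kind of measurement a trial period should collect, not any figures from the ATA deployments.

```python
import math

# Illustrative break-even estimate for adopting a test-automation tool.
# All numbers are hypothetical; the point is that a "free" research tool
# still carries adoption cost that must be weighed against recurring savings.

def break_even_tests(setup_hours, hours_per_test_manual, hours_per_test_tool):
    """Smallest number of tests after which one-time setup cost pays off."""
    saving_per_test = hours_per_test_manual - hours_per_test_tool
    if saving_per_test <= 0:
        return None  # the tool never pays off at these rates
    return math.ceil(setup_hours / saving_per_test)

# Hypothetical trial figures: 200 hours of customization/integration effort,
# 2.0 hours to script a test by hand vs. 0.5 hours with the tool.
n = break_even_tests(setup_hours=200, hours_per_test_manual=2.0,
                     hours_per_test_tool=0.5)
print(n)  # 134 tests before the investment breaks even
```

A team automating thousands of tests clears such a threshold quickly; a team with a few hundred tests may not, which is exactly why the case has to be made with data rather than assumed.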
This chicken-and-egg problem was resolved
by a lucky coincidence. We came in contact with a team
located in the next building over from
us, in charge of test automation for an
internal website. This team was trying to
get through automation of about 7,000
tests, and they were falling behind.
Since they were desperate, and were
not under the confines of a client contract, they decided to try out ATA. This
allowed us to collect some citable data.2
The actual data is not important here,
and possibly had caveats, but it corroborated the claims of higher productivity
as well as script resilience. We tried to
offer this team as good “customer service” as we possibly could, which came
in handy later when we asked them to
be our reference for others.
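The “script resilience” claim refers to test scripts that keep working when the application under test changes. One way to convey the idea is a locator that records several properties of a UI element and falls back through them, rather than relying on a single brittle identifier. The sketch below uses a made-up element model and lookup function; it illustrates the general principle, not ATA’s actual implementation.

```python
# Sketch of change-resilient element lookup: record multiple properties
# (id, label, xpath) at script-creation time and try them in turn, so the
# script survives a change to any one of them. Hypothetical data model.

def find_element(ui, locator):
    """Return the unique element matching any recorded property, else None."""
    for key in ("id", "label", "xpath"):
        wanted = locator.get(key)
        if wanted is None:
            continue
        matches = [e for e in ui if e.get(key) == wanted]
        if len(matches) == 1:
            return matches[0]
    return None

# Locator recorded when the script was first created:
locator = {"id": "btn-42", "label": "Submit", "xpath": "/form/div[3]/button"}

# After a UI change, the id and xpath differ, but the label survives:
changed_ui = [
    {"id": "btn-submit", "label": "Submit", "xpath": "/form/div[2]/button"},
    {"id": "btn-cancel", "label": "Cancel", "xpath": "/form/div[2]/a"},
]
print(find_element(changed_ui, locator)["id"])  # btn-submit
```

A script built on single hard-coded locators would have broken here; the fallback chain is what keeps maintenance cost down as the application evolves.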