This material is based in part on work supported by the National Science Foundation under grants IIS-1422066, CCF-1442840, IIS-1717473, and IIS-
References
1. Allen, R.B. Mental models and user models. Chapter
in Handbook of Human-Computer Interaction, Second
Edition, M.G. Helander, T.K. Landauer, and P.V. Prabhu,
Eds. North-Holland, Amsterdam, the Netherlands,
2. Argyris, C. The executive mind and double-loop
learning. Organizational Dynamics 11, 2 (Autumn
3. Argyris, C. Teaching smart people how to learn.
Harvard Business Review 69, 3 (May-June 1991).
4. Argyris, C. Double-loop learning. Chapter in Wiley
Encyclopedia of Management, C.L. Cooper, P.C. Flood,
and Y. Freeney, Eds. John Wiley & Sons, Inc., New
5. Austin, R.D. and Devin, L. Artful Making: What
Managers Need to Know About How Artists Work.
Financial Times Press, Upper Saddle River, NJ, 2003.
6. Brown, C. and Linden, G. Chips and Change: How Crisis
Reshapes the Semiconductor Industry. MIT Press,
Cambridge, MA, 2009.
7. Hendrikx, M., Meijer, S., Van Der Velden, J., and Iosup,
A. Procedural content generation for games: A
survey. ACM Transactions on Multimedia Computing,
Communications, and Applications 9, 1 (Feb. 2013), 1.
8. Jaderberg, M., Mnih, V., Czarnecki, W.M., Schaul,
T., Leibo, J.Z., Silver, D., and Kavukcuoglu, K.
Reinforcement learning with unsupervised auxiliary
tasks. In Proceedings of the Fifth International
Conference on Learning Representations (Toulon,
France, Apr. 24–26, 2017).
9. Lake, B., Ullman, T., Tenenbaum, J., and Gershman,
S. Building machines that learn and think like people.
Behavioral and Brain Sciences 40, E253 (2017).
10. Mnih, V., Kavukcuoglu, K., Silver, D., Rusu, A.A.,
Veness, J., Bellemare, M.G., Graves, A., Riedmiller, M.,
Fidjeland, A.K., Ostrovski, G., and Petersen, S. Human-level control through deep reinforcement learning.
Nature 518, 7540 (Feb. 2015), 529–533.
11. Sennett, R. The Craftsman. Allen Lane, London, U.K., 2008.
12. Werle, G. and Martinez, B. Ghost Recon Wildlands:
Terrain tools and technologies. Game Developers
Conference (San Francisco, CA, Feb. 27–Mar.
3, 2017); https://666uille.files.wordpress.
13. Yumer, M.E., Asente, P., Mech, R., and Kara, L.B.
Procedural modeling using autoencoder networks. In
Proceedings of the 28th Annual ACM Symposium on
User Interface Software & Technology (Charlotte, NC,
Nov. 11–15). ACM Press, New York, 2015, 109–118.
Stefan Seidel (firstname.lastname@example.org) is a professor and
the Chair of Information Systems and Innovation at the
Institute of Information Systems at the University of
Liechtenstein, Vaduz, Liechtenstein.
Nicholas Berente (email@example.com) is an associate
professor of IT, analytics, and operations in the Mendoza
College of Business at the University of Notre Dame,
Notre Dame, IN, USA.
Aron Lindberg (firstname.lastname@example.org) is an assistant
professor of information systems in the School of
Business of Stevens Institute of Technology, Hoboken, NJ, USA.
Kalle Lyytinen (email@example.com) is a Distinguished
University Professor and Iris S. Wolstein Professor of
Management Design at Case Western Reserve University,
Cleveland, OH, USA.
Jeffrey V. Nickerson (firstname.lastname@example.org) is a
professor of information systems and the Associate Dean
of Research of the School of Business at Stevens Institute
of Technology, Hoboken, NJ, USA.
Copyright held by authors.
outputs in a way that leads to new hypotheses about which sets of input parameters should be tested in the next batch of experiments.
Adjustment. Evaluation by human
designers can lead to the adjustment
of parameter values (see Figure 2, loop
1) or even to changes in the mental
model embedded in the autonomous
tool, resulting in changes in the algorithms used; moreover, it might also
change the mental models of human
designers in terms of goals, cognitive rules, and underlying reasoning.
Changes of the mental model embedded in the autonomous tool could
change the tool’s constraints and propensities and require changes to the
mental models of designers; likewise,
changes in the mental models of designers could require changes to the
algorithms and thus the mental model
embedded in the tool. Following each
experiment, designers might thus
have to continuously reconcile their
mental models with the counterpart
models embedded in the autonomous
tool (see Figure 2, loop 3).
In order to change the mental model
embedded in an autonomous tool, designers have to modify the underlying
algorithm. The original mental model
embedded in the tool—the one implemented by the tool designer—can thus
evolve over time.
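The evaluate-and-adjust cycle described above can be sketched in simplified form. The following is a minimal, hypothetical Python sketch, not any specific tool's implementation: `generate` stands in for the autonomous generative tool, `evaluate` stands in for the designer's judgment of the output, and `adjust` plays the role of loop 1, perturbing parameter values to form the next batch of experiments. All names and the scoring rule are illustrative assumptions.

```python
import random

def generate(params):
    # Stand-in for an autonomous generative tool: maps parameters to an output.
    # Here the "artifact" is just a number; real tools produce terrain, levels, etc.
    return params["detail"] * params["scale"]

def evaluate(artifact, target=42.0):
    # Stand-in for the designer's evaluation of the output: lower is better.
    return abs(artifact - target)

def adjust(params, step=0.5):
    # Loop 1: perturb parameter values to form a hypothesis for the next experiment.
    return {k: v + random.uniform(-step, step) for k, v in params.items()}

random.seed(0)
best = {"detail": 3.0, "scale": 10.0}
best_score = evaluate(generate(best))

for _ in range(200):            # successive batches of experiments
    candidate = adjust(best)
    score = evaluate(generate(candidate))
    if score < best_score:      # keep the parameter set that evaluates better
        best, best_score = candidate, score

print(round(best_score, 2))
```

Changing the mental model embedded in the tool, by contrast, would mean rewriting `generate` itself rather than merely tuning the parameters fed into it, which is why such changes ripple back into the designers' own mental models.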
Competencies related to these design practices become critically important for achieving complex design
outcomes. Having a detailed understanding of the designed artifact, as
well as of the consequences of specific
local decisions, becomes less important. This explains why, in the context
of, say, chip design, we see software
engineers displacing electrical engineers who have a deep understanding of the physical aspects of chip design. Because the design is increasingly mediated by software that must be parameterized and evaluated, designers' software skills become crucial; the table here outlines key implications in terms of emergent, interrelated designer activities.
The Road Ahead
Some substitution of human design activity through autonomous tools is indeed occurring. To a certain degree, demand for specific,
manual-type competencies in design professions, including software development, is decreasing, while the demand for skills focused on how to work with software tools is increasing. Organizations need to engage more effectively with new forms of autonomous tools supporting design processes. This is not simply a shift of tasks from humans to machines but a deeper shift in the relationship between humans and machines in the context of complex knowledge work. The shift puts humans in the role of coaches who guide tools to perform according to their expectations and requirements (see Figure 2, loop 1) or in the role of laboratory scientists conducting experiments to understand and modify the behavior of complex knowledge artifacts (see Figure 2, loop 2 and loop 3).
Engaging with autonomous tools requires reshaping the competencies
designers need. Designers envision
certain results and thus need to interact with autonomous tools in ways that
help them realize their design vision.
At the same time, the use of autonomous tools opens unprecedented
opportunities for creative problem
solving. Consider the example of video game production, where autonomous tools are increasingly able to
procedurally generate artifacts at a scope and scale that were not possible in the past. Future designers will constantly be challenged to rethink their
mental models, including their general approach to design. The continuous reconciliation of mental models
embedded in both designer cognition and their tools is an extension
of traditional design processes that
involve artful making, where human actors gradually adjust their mental models to converge on solutions [5].
The proposed three-loop model
contributes to the ongoing debate on
how artificial intelligence will change
knowledge work, challenging knowledge workers to operate at a different
level. Designers may become increasingly removed from the actual artifact
but still use tools to create artifacts of a
complexity never imagined before.