• Iterative testing of embodied
working prototypes
• Finding a minimum viable product
(MVP) and minimum viable data (MVD)
• Creating new AI technology
requirements.
To support the above affordances, AI
design tools need powerful capabilities
and workflows that mesh fluidly with
design processes. In the same way
that we use tools such as InVision and
Proto.io for the prototyping of screen
interactions, we need new tools for the
prototyping of autonomous systems and
their interactions.
THE DELFT AI TOOLKIT
Over the past year, I’ve been developing
the Delft AI Toolkit (https://github.com/pvanallen/delft-toolkit-v2) as
an experiment in prototyping ways
to prototype AI. It is an open source
visual authoring system that integrates
AI techniques and technologies into an
easy-to-use drag-and-drop environment
that strives toward the needs and
methods outlined above.
This toolkit builds on my prior
research in toolkits with NTK
(netlabtoolkit.org) and with an approach
to AI I call animistic design. This work
helped inform both what to do and what
mistakes to avoid.
NTK. Inspired by data-flow tools
like Max/MSP, NTK (Figure 3) is a
drag-and-drop authoring system for
designing projects with sensors and
actuators. Using microcontrollers
like the Arduino, NTK helps the
designer create working prototypes
with easy-to-use visual widgets
that encapsulate domain expertise.
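The data-flow idea behind NTK can be sketched in a few lines: each widget is a node that encapsulates domain expertise (here, sensor smoothing and thresholding), and nodes are wired into a chain that readings flow through. The node names and API below are illustrative assumptions, not NTK's actual interface.

```python
class SmoothingNode:
    """Exponential smoothing widget: hides the filter math from the designer."""
    def __init__(self, factor=0.5):
        self.factor = factor
        self.value = None

    def process(self, reading):
        if self.value is None:
            self.value = reading
        else:
            self.value = self.factor * self.value + (1 - self.factor) * reading
        return self.value


class ThresholdNode:
    """Turns a continuous signal into an on/off actuator command."""
    def __init__(self, threshold=0.6):
        self.threshold = threshold

    def process(self, value):
        return value > self.threshold


def run_chain(nodes, readings):
    """Push each sensor reading through the chain of nodes, in order."""
    outputs = []
    for r in readings:
        for node in nodes:
            r = node.process(r)
        outputs.append(r)
    return outputs


chain = [SmoothingNode(factor=0.5), ThresholdNode(threshold=0.6)]
print(run_chain(chain, [0.0, 1.0, 1.0, 0.2]))  # -> [False, False, True, False]
```

The point of the pattern is that the designer rewires and re-parameterizes nodes rather than writing filter or control code, which is what makes fast iteration with sensors and actuators possible.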
Such tools should enable us to develop a facility
with AI as a material for design
projects. Designers need to develop
a strong tacit understanding of AI
through tinkering, exploring, and
building. This demystification and
understanding can lead to new
collaborations, design concepts, and
methods for AI through the unique
engagement that designers have with
their material. Designers bring an
important positionality and value
system that needs a place at the table in
creating new directions for AI itself.
While there are interesting new tools
being introduced for the development of
machine-learning models (e.g., lobe.ai),
there is little that specifically targets the
design of the interactions and behaviors
that compose the human experience
around the AI models. The character of
AI interactions is complex and different,
arising from independent, evolving
systems that need ongoing tending and
maintenance by end users. How will
each new application of AI work, and
what are the human outcomes?
To go beyond the clichés of AI and
create consequential, fulfilling, and
ethically sound new interactions,
designers need productive ways to craft
a refined aesthetic for the design of the
behaviors, sociality, and narratives that
AI systems need.
Such tools must address the
prototyping and design of:
• Personality and character for
autonomous behavior (e.g., Figure 2)
• Multimodal, non-visual interactions
• User training/pruning/tending/
learning
• Point of view (POV) and biases
(considering diversity in people
and machines)
• Mixed social interactions, both
machine-to-human (M2H) and
machine-to-machine (M2M)
• Intentions/goals/rules
• Ethics/civic responsibility
• Indication of expertise and
affordance.
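One way to make the first item on this list concrete is a set of personality parameters that bias an agent's choice of behavior, in the spirit of the sliders shown in Figure 2. The parameter names follow the figure; the sampling logic and behavior names are assumptions for illustration, not the toolkit's implementation.

```python
import random

# Personality sliders in [0, 1], after Figure 2 (subset shown).
PERSONALITY = {
    "autonomy": 0.7,      # how often the device acts without being asked
    "serendipity": 0.3,   # chance of an unexpected, playful behavior
    "mood": 0.5,          # would bias expressive output (unused in this sketch)
}


def choose_behavior(personality, rng):
    """Pick a behavior by sampling against the personality sliders."""
    if rng.random() < personality["serendipity"]:
        return "improvise"
    if rng.random() < personality["autonomy"]:
        return "act_on_own_goal"
    return "wait_for_user"


print(choose_behavior(PERSONALITY, random.Random(42)))  # -> act_on_own_goal
```

Exposing these weights as visual controls would let a designer tune a prototype's character live, during a test session, rather than between code deployments.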
And the tools must enable the
designer to use methods such as:
• Fast, iterative, experimental
prototyping of interactions and
autonomous behaviors
• Wizard of Oz (WOZ) design
experiments and testing with people
• Comparing different algorithms,
datasets, and training methods
Figure 1. The Delft AI Toolkit running on a laptop, a robot device that works
with the toolkit, and a tablet used to marionette the device's behavior as a
Wizard of Oz technique. In this photo, the robot is interacting in a way defined
by the visual authoring system, mirroring what the on-screen 3D model of the
robot is doing.
Figure 2. This diagram identifies possible functionality in the
toolkit for adjusting the personality of the AI device through
a visual interface: autonomy, smartness, serendipity,
provocation, mood, and learning. This example list is not
exhaustive, but it does illustrate the kinds of high-level
controls that would help designers.
Figure 3. The interface for the author’s NTK visual programming toolkit for working with
Arduino, sensors, actuators, and media (netlabtoolkit.org).