Evaluation and usability as a practice area has diversified its approaches, broadened the spectrum of
UX issues it addresses, and extended its contribution into deeper levels of product-development decision
making. This forum addresses conceptual, methodological, and professional issues that arise in the
field’s continuing effort to contribute robust information about users to product planning and design.
David Siegel and Susan Dray, Editors
In Defense of Doing It the Hard Way
Leanna Gingras
ITHAKA | leanna.gingras@ithaka.org
The job of a user-research professional is undoubtedly a hard one. Understanding problems, getting the right sample of people in our labs, extracting insights from data, and evangelizing the user's needs can make for challenging work. At the same time, rewards abound in this profession: the joy of diving into a new topic, engrossing conversations with some of the hundreds of people who pass through my lab, and of course the aha moments. Those glimmers of awesomeness alone more than make up for any difficulties. But every now and then I wish it were all a little...easier.

In the heat of the workweek, I've been tempted by quick fixes and shortcuts. A glance at the battlefield of user research tells me I'm not alone. It seems as if every week I read about some paradigm-shattering new tool that promises to blow my mind, crunch all of my data by 5 o'clock, and have dinner on the table by 7. Tools like these are often pitched to us, an eager audience of open-minded, tired, bored, inexperienced, or budget-starved user-experience evaluators.

These promises are rarely fulfilled. I still end up spending hours hunched over my computer, or I don't get the insights I was hoping for, or the quality of my work just plain suffers. After many failed experiments, I am starting to think that these gimmicks and borrowed techniques from other fields amount to shortcuts, and shortcuts are not exactly formulas for success. Worse, I'm concerned that the quality of our work as a whole suffers: Every time we cut corners, we deliver subpar work that waters down the value that user research can offer. We might not intend to skimp on our work, or we might feel pressured to cut corners in our quest to deliver more work more quickly, but no matter how you slice it, shortcuts aren't actually doing us any favors. Shortcuts don't help us produce good work, and if we strive to produce good work, shortcuts don't actually save time. We have to do it the hard way.
What Do Shortcuts and Cut Corners Look Like?
Let's get something out of the way first: When picking on shortcuts, I'm not targeting appropriate guerrilla user research. I have no issue with designers or one-man bands who just want to know how to improve their products. They don't need to do it the hard way, and when they are ready to do it the hard way, they will approach their work differently, either by learning new skills or bringing in a seasoned researcher.

Rather, this discussion is intended for people whose primary focus is user research, day in and day out, whose job is to learn more about users and to understand their context. Solid user research requires both sweat and diligent work. Whether we are in the lab running usability studies or out in the field conducting ethnographic research, our core value as user-research professionals lies in our deep understanding of context, our analytical skills, and our ability to bring empiricism into the product-development process.

To put it another way, user-research professionals get hired not just because we are good at excavating truth, but also because we have a knack for mapping the knowns and unknowns around those truths, finding new points to investigate, and communicating the core truths that we learn in a way that's helpful and productive. When we do that, we can help a design plow all the way to the other end of development, through shifting requirements and slippery scopes, without ever losing focus on the needs that the design was built to address.

It's our job to ensure that rigor backs our process, and that we are actually being as precise in our measurements as we think we are.