Unfortunately, in our line of work there are many opportunities to deceive ourselves into thinking we can save time, energy, or money without sacrificing the precision and accuracy of our work. Anything that doesn't require much sweat, plodding, or careful attention to detail is a shortcut, whatever form that shortcut may take. Sometimes a shortcut promises to reduce the amount of time we spend planning and executing studies. Other times a shortcut claims to make analyzing data easier. Still other times, a shortcut takes the form of a misapplied tool.
The Shotgun Shortcut: Executing Studies Ineffectively
Generally, the more time a shortcut claims to save me, the more suspicious I am of it. A case in point: I'm highly suspicious of unmoderated open card sorts, in which remote participants are given a heap of cards and asked to sort them into categories and label the categories.

Hosting card sorts online saves time and resources by allowing users to complete them at their leisure, without the need for professional attention. However, this adaptation comes at a cost: it sacrifices the main benefit of that particular research methodology, namely access to the rich, qualitative verbal reports from our participants that help us understand how they construe the world. With this understanding, we can address the why of things. Unmoderated card sorts cannot give you this why; they can give you only the what and the when.
One rebuttal is that if we run a large enough sample, we can call it quantitative data. But this is still qualitative data; in adapting this methodology poorly, we lose its intent and its strength. When we mechanize it and remove that ability to follow a participant's train of thought, we shove all that beautiful qualitative data into a quantitative box. The result is a monster pile of data, stripped of context and of any good foothold from which we can understand what these categories and labels mean to the user and their work.