User feedback. We find it is critical
to involve users early on and conduct
qualitative interviews and surveys to
check their overall impressions of the
visualizations produced by our systems. Such feedback is essential for
identifying problems and ensuring
our design principles and the visualizations converge on effective designs.
The interviews and surveys provide
high-level checks of the effectiveness
of our design principles and allow us
to tweak the principles when they are not quite right; for example, early in building LineDrive, we asked users to rate handcrafted prototype route-map designs, finding that 79 out of 90 respondents preferred the distorted LineDrive prototypes to maps drawn to scale1 and confirming that users thought the distorted maps were useful. Continual feedback and evaluation yield more-effective algorithms and tools.
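A natural sanity check on a preference count like 79 of 90 is an exact binomial (sign) test against chance. The sketch below is ours, not the article's analysis; only the 79-of-90 figure comes from the study, and the two-sided doubling assumes the observed count is at or above half the sample.

```python
from math import comb

def sign_test_p(successes: int, n: int) -> float:
    """Two-sided exact binomial (sign) test against chance preference (p = 0.5).

    Assumes successes >= n / 2, so the two-sided p-value can be taken as
    twice the upper-tail probability (capped at 1.0).
    """
    tail = sum(comb(n, k) for k in range(successes, n + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# 79 of 90 respondents preferred the distorted LineDrive prototypes
p = sign_test_p(79, 90)
```

With 79 of 90, the resulting p-value is vanishingly small, so a preference this lopsided is extremely unlikely under indifference.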
Another approach is to release the
visualization on the Web, then check
usage statistics; for example, at its
peak, LineDrive was serving more than
750,000 maps per day and became the
default route-mapping style for MapBlast, an early Web-based provider of
route maps. Such public feedback is
a strong test of effectiveness, as ineffective solutions are quickly rejected.
We also recognize that usage statistics
are at best an indirect measure of effectiveness. Many excellent solutions
remain little-used due to a variety of
external forces that have little to do
with the usefulness or effectiveness of
a visualization.
User studies. To quantitatively assess the effectiveness of a visualization, we conduct user studies comparing visualizations created with our
design algorithms to the best hand-designed visualizations in the domain; for example, we have compared
our computer-designed instructions
to factory-produced instructions and
hand-drawn instructions for assembling a TV stand, finding that users
completed the assembly task about
35% faster and made 50% fewer errors
using our instructions. In addition
to completion time and error rate, it
is also possible to use eye-tracking to
determine how a visualization affects
the way people scan and process information.6,21 Such eye-tracking studies help us evaluate the effectiveness of low-level design choices in creating
visualizations. Rigorous user studies
are especially important because they
also serve to validate the effectiveness
of the design principles on which the
visualizations are based.
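For between-condition comparisons of measures like completion time, an exact permutation test on the difference of means is one simple, assumption-light option. This sketch is illustrative only: the completion times below are hypothetical, not the study's data, and the article does not specify its statistical procedure.

```python
from itertools import combinations
from statistics import mean

def permutation_test(a: list[float], b: list[float]) -> float:
    """Exact two-sided permutation test on the difference of group means.

    Enumerates every way of splitting the pooled observations into groups
    of the original sizes and counts how often the absolute mean difference
    is at least as extreme as the observed one.
    """
    observed = abs(mean(a) - mean(b))
    pooled = a + b
    n = len(a)
    total = extreme = 0
    for idx in combinations(range(len(pooled)), n):
        grp_a = [pooled[i] for i in idx]
        grp_b = [pooled[i] for i in range(len(pooled)) if i not in idx]
        total += 1
        if abs(mean(grp_a) - mean(grp_b)) >= observed - 1e-12:
            extreme += 1
    return extreme / total

# Hypothetical assembly completion times in seconds (illustrative only)
ours = [300, 310, 320, 305, 315, 308]      # computer-designed instructions
factory = [480, 470, 495, 460, 500, 475]   # factory-produced instructions
p = permutation_test(ours, factory)
```

Exhaustive enumeration keeps the test exact and deterministic; for larger samples one would switch to random resampling of the splits.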
However, how to design such quantitative studies is not always clear. How should one visualization be compared against another?
For example, in the domain of anatomical illustrations it is not clear how
to compare our cutaway illustrations
against hand-designed illustrations.
What task should we ask users to perform using the two illustrations? One
approach might be to measure how
quickly and accurately viewers locate
a particular organ of the body. However, if the task is to learn the location
of the organ, then both illustrations
would label the organ, and with labels, speed and accuracy are unlikely
to differ significantly. Our cutaways
and exploded views are also designed
to convey the layering relationship
between parts. So, an alternative task
might be to ask viewers to indicate the
layering relationships between parts.
But how can we ask them to complete
this task without leading them to
an answer? For many domains, like
anatomical illustrations, developing
a new methodology is necessary for
evaluating the effectiveness of visualizations and validating underlying design principles.
Conclusion
The approach we’ve outlined for identifying, instantiating, and evaluating design principles for visual communication is a general methodology for combining findings about human perception and cognition with automated design algorithms. The systems we’ve built for generating route maps, tourist maps, and technical illustrations demonstrate this methodology can be used to develop effective automated visualization-design systems. However, there is much room for extending our proposed approach, and we hope researchers improve on the methods we have described. Future work can take several directions:
Many other information domains could benefit from a deeper understanding of the ways visual-display techniques affect the perception and