INTERACTIONS.ACM.ORG 84 INTERACTIONS NOVEMBER–DECEMBER 2018
FORUM EVALUATION AND USABILITY
Takeaway: Use expert reviews when usability and subject-matter experts are available and resources are scarce. I would like to add a personal warning that was not tested in the CUE studies: Use expert reviews with great care in organizations that have low usability maturity.
Usability testing is just one part of a market basket of evaluation methodologies.
Unusable test reports. The quality of the usability test reports varied dramatically.
Expert reviews are useful. CUE-4 indicated that expert reviews produce results of a quality comparable to usability tests—at least when carried out by experts.
Most issues reported by usability
testing were also reported by expert
reviews and vice versa. Few false
problems were identified. As expected,
expert reviews seem to require slightly
fewer resources than usability tests.
CUE-3 indicated that professionals with limited experience may have problems using expert reviews. This is one of the few places in the CUE studies where we saw false alarms.
Study   | Teams | Time           | What each participating team did
CUE-1   |  4    | March 1998     | Usability test of Task Timer for Windows
CUE-2   |  9    | December 1998  | Usability test of Hotmail.com (7 professional teams, 2 student teams)
CUE-3   | 11    | September 2001 | Usability inspection of Avis.com
CUE-4   | 17    | March 2003     | Evaluated the usability of HotelPenn.com [2]
CUE-5   | 13    | August 2005    | Evaluated the usability of IKEA’s online wardrobe planner
CUE-6   | 13    | October 2006   | Evaluated the usability of Enterprise.com
CUE-7   |  8    | March 2007     | Made recommendations for fixing six usability problems on IKEA.com [3]
CUE-8   | 15    | June 2009      | Measured the usability of key tasks on Enterprise.com [4]
CUE-9a  | 17    | June 2011      | Analyzed five videos of usability tests of UHaul.com – Atlanta, US [5]
CUE-9b  | 18    | August 2011    | Analyzed five videos of usability tests of UHaul.com – Chemnitz, DE [5]
CUE-10  | 16    | May 2018       | Moderated three test sessions of Ryanair.com; sessions were video recorded
→ Table 1. An overview of the 10 CUE studies that have been conducted to this point.
Study                                                      | CUE-2 | CUE-4 | CUE-9
Number of participating teams                              |   9   |  17   |  35
Total number of issues (problems + positive findings)      |  310  |  340  |  223
Issues reported by:
– All participating teams                                  |   0   |   0   |   0
– More than 75% of the teams                               |   1   |   3   |   4
– 40% to 75% of the teams                                  |  10   |  17   |  18
– At least 3 teams but less than 40% of the teams          |  17   |  64   |  76
– 2 teams                                                  |  50   |  51   |  35
– Single teams only                                        |  232  |  205  |   90
Share of all issues reported by single teams only          |  75%  |  60%  |  40%
Serious or critical problems reported by single teams only |  29   |  61   |  17
→ Table 3. Number of issues reported by one or more teams.
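The "single teams only" percentages in Table 3 follow directly from the counts above them; a quick arithmetic check using only the figures printed in the table:

```python
# Share of issues reported by a single team only, per study,
# computed from the totals and single-team counts in Table 3.
totals = {"CUE-2": 310, "CUE-4": 340, "CUE-9": 223}
single_team = {"CUE-2": 232, "CUE-4": 205, "CUE-9": 90}

for study, total in totals.items():
    share = single_team[study] / total
    print(f"{study}: {share:.0%} of all issues were reported by a single team only")
# Rounds to 75%, 60%, and 40%, matching the table.
```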
Huge number of issues     | The total number of usability issues for a modern website is huge: 300 or more.
Five users are not enough | Five users are not enough to find 75% or even 25% of the usability problems on a website.
No gold standard          | Usability testing is not the perfect method against which all other methods can be measured.
Expert reviews are useful | Expert reviews produce results of a quality comparable to usability tests.
Unusable test reports     | The quality of the usability test reports varied dramatically.
Task design               | There was virtually no overlap between tasks used by different teams.
Few false alarms          | Almost all reported issues were valid.
→ Table 2. Key findings from CUE studies.
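The "five users are not enough" finding can be put next to the classic problem-discovery model often used to justify small test samples: the probability that at least one of n users hits a given problem is 1 − (1 − p)^n, where p is the per-user detection probability. A minimal sketch, with illustrative p values that are assumptions for the example, not CUE data:

```python
# Classic problem-discovery model: probability that a problem is found
# by at least one of n users, given per-user detection probability p.
def discovery_rate(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

# Illustrative p values (assumptions, not CUE data). When a site has
# 300+ issues, most problems have low p, and five users find few of them.
for p in (0.31, 0.10, 0.05):
    print(f"p={p:.2f}, n=5 -> {discovery_rate(p, 5):.0%} of such problems found")
# p=0.31 -> 84%, p=0.10 -> 41%, p=0.05 -> 23%
```

The model makes the point quantitatively: the optimistic coverage claims hold only for highly visible problems, which is consistent with the table's finding that five users do not come close to finding 75% of the issues.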