suggestion is that specialists or experts evaluate early versions (low- or
high-fi prototypes), and then users evaluate later versions [17].
Heuristics can be used in other ways than as described by the standard
heuristic evaluation procedure. In [52], Tory and Möller created a set of
heuristics used as statements in a questionnaire. After solving tasks using
different visualization techniques, the participants rated on a 7-point scale
how much they agreed with the statements. In [53], the authors applied
heuristics in a post-test questionnaire in which participants rated their
agreement with heuristic statements assessing their experience with the
visualization system. The questionnaire then served as a guide for a
post-test interview in which the outcome of the evaluation was discussed
and reflected on with each participant.
evaluation with each participant. In [40], the authors used the heuristics set
proposed by Forsell and Johansson [7] when analysing and structuring the
results from a qualitative user study. The 10 heuristics formed categories
into which the results were sorted and analysed.
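The questionnaire-based use of heuristics described above can be sketched in code. The following is a minimal illustration, assuming invented heuristic statements and 7-point Likert ratings (1 = strongly disagree, 7 = strongly agree); none of the names or numbers come from the cited studies.

```python
from statistics import mean

# Hypothetical 7-point agreement ratings per heuristic statement,
# one list entry per participant (illustrative data only).
ratings = {
    "Information coding":   [6, 5, 7, 4, 6],
    "Spatial organization": [3, 4, 2, 5, 3],
    "Orientation and help": [5, 6, 5, 6, 7],
}

# Summarize agreement per heuristic, lowest first,
# so the weakest areas of the visualization stand out.
summary = sorted(
    ((statement, mean(scores)) for statement, scores in ratings.items()),
    key=lambda pair: pair[1],
)

for statement, avg in summary:
    print(f"{statement}: mean agreement {avg:.1f}/7")
```

Such a per-heuristic summary could then serve as the agenda for the post-test interview, starting the discussion at the lowest-rated statements.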
Some More Advice
Heuristics in an evaluation are meant to guide evaluators in finding and
explaining usability problems, but they should not restrict evaluators to
finding only the problems the heuristics explain. All problems should be
reported. Standardized sets such as Nielsen's ten usability heuristics [28]
are developed to provide a high degree of explanatory cover. When
researchers develop their own sets, based on published heuristics and/or
their own “home-grown” ones, it is impossible to predict their explanatory
power.
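One way to make "explanatory cover" concrete is to measure the share of reported problems that at least one heuristic in the set can explain. The sketch below is purely illustrative; the problems and their heuristic mappings are invented, not taken from any cited study.

```python
# Each reported problem is mapped (by the analyst) to the heuristics
# that explain it; an empty list means no heuristic in the set covers it.
problems = [
    {"id": 1, "explained_by": ["Consistency and standards"]},
    {"id": 2, "explained_by": ["Visibility of system status"]},
    {"id": 3, "explained_by": []},  # found, but outside the set's cover
    {"id": 4, "explained_by": ["Error prevention", "Help and documentation"]},
]

covered = sum(1 for p in problems if p["explained_by"])
coverage = covered / len(problems)
print(f"Explanatory cover: {covered}/{len(problems)} problems ({coverage:.0%})")
```

A home-grown heuristic set can be checked against pilot-study findings this way before it is relied on in a full evaluation.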
As an evaluation manager, remain neutral. When interacting with the
evaluators, prompt them in an unbiased way and ask unbiased questions;
this motivates them to give honest responses.
It is essential to assure evaluators that they should not feel any pressure to
respond positively to you, the evaluation manager. On the contrary, they
should be encouraged to point out problems and give constructive critique,
since obtaining such feedback is the aim of the evaluation. Do not defend
the visualization technique or the procedure if there is a "failure" during
the evaluation. If things go wrong, attribute the problem to the equipment,
to yourself, or to the visualization or tasks being hard to understand or
perform.
The method may seem overly critical, because it is used to identify
usability problems rather than to find good features. When reporting
results from heuristic evaluations, include positive feedback as well.