- That the empirical methodology was sensitively chosen. The methodology should be a good fit to the research question, the situation, and the research goals.
- That the study was conducted with appropriate rigor. All methodologies have their own requirements for rigor, and these should be followed. However, while trying to fit the rigor from one methodology onto another is not appropriate, developing hybrid methodologies that better fit a given research situation and benefit from two or more methodologies should be encouraged.
- That sufficient details are published so that the reader can fully understand the processes and, if appropriate, reproduce them.
- That claims are made appropriately according to the strengths of the chosen methodology. For instance, if a given methodology does not generalize well, then generalizations should not be drawn from the results.
While there is growing recognition in our research community that evaluating information visualization is difficult [55, 57, 67], recognizing this difficulty has not in itself provided immediate answers about how to approach the problem. Two positive recent trends are notable: first, more evaluative papers in the form of usability studies have been published [25, 40, 47, 63, 80, 82], and second, several papers have called for more qualitative evaluations and for complementary qualitative and quantitative approaches [18, 36, 48, 74].
This paper is intended as a pointer to a greater variety of empirical methodologies and as encouragement toward their appreciation and, better still, their active use. Many more such techniques exist, and techniques of this kind are being developed and improved continuously. There are real benefits to be had from actively borrowing ethnographic and sociological research methods and applying them to our information visualization needs. In this paper we have argued for an increased awareness of empirical research. We have discussed the relationship of empirical research to information visualization and have made a call for a more sensitive application of this type of research [27]. In particular, we encourage thoughtful application of a greater variety of evaluative research methodologies in information visualization.
Acknowledgments. The ideas presented in this paper have evolved out of many dis-
cussions with many people. In particular this includes: Christopher Collins, Marian
Dörk, Saul Greenberg, Carl Gutwin, Mark S. Hancock, Uta Hinrichs, Petra Isenberg,
Stacey Scott, Amy Voida, and Torre Zuk.
References
1. Amar, R.A., Stasko, J.T.: Knowledge Precepts for Design and Evaluation of Information Visualizations. IEEE Transactions on Visualization and Computer Graphics 11(4), 432-442 (2005)
2. Andrews, K.: Evaluating Information Visualisations. In: Proceedings of the 2006 AVI Workshop on BEyond Time and Errors: Novel Evaluation Methods for Information Visualization, pp. 1-5 (2006)
3. Auerbach, C.: Qualitative Data: An Introduction to Coding and Analysis. New York University Press, New York (2003)