proaches: data-driven, motivated from previous research, or theory-driven, each with respectively decreasing levels of sensitivity to the data [10]. In the first style, data-driven coding, commonly called open coding [14], themes and a code set are derived directly from the data and nothing else. If the analysis is motivated by previous research, the questions and perhaps codes from the earlier research can be applied to the new data to verify, extend, or contrast the previous results. With theory-driven coding, one may use a given theory, such as grounded theory [13] or ethnomethodology [24], as a lens through which to view the data.
In any case, the coded data may then be interpreted in more generalized terms. Qualitatively coded data may also be combined with quantitative or statistical measures to try to distinguish themes or sampling groups.
5.4 Qualitative Summary
Qualitative studies can be a powerful methodology by which one can capture salient aspects of a problem that may provide useful design and evaluation criteria. Quantitative evaluation is naturally precision-oriented, but a shift from high precision to high fidelity may be made with the addition of qualitative evaluations. In particular, while qualitative evaluations can be used throughout the entire development life cycle in other research areas such as CSCW [41, 52, 64, 73], observational studies have been found to be especially useful for informing design. Yet these techniques are under-used and under-reported in the information visualization literature. Broader approaches to evaluation, different units of analysis, and sensitivity to context are important when complex issues such as insight, discovery, confidence, and collaboration need to be assessed. In more general terms, we would like to draw attention to qualitative research approaches that may help to address difficult types of evaluation questions. As noted by Isenberg et al. [36], a sign in Albert Einstein's office that read, 'Everything that can be counted does not necessarily count; everything that counts cannot necessarily be counted' is particularly salient to this discussion, reminding us to include empirical research about important data that cannot necessarily be counted.
6 Conclusions
In this paper we have made a two-pronged call: one for more evaluations in general, and one for a broader appreciation of the variety and importance of many different types of empirical methodologies. To achieve this, we as a research community need both to conduct more empirical research and to be more welcoming of this research in our publication venues. As noted in Section 4, even empirical laboratory experiments, our best-known type of empirical methodology, are often difficult to publish. One factor in this is that no empirical method is perfect. That is, there is always a trade-off between generalizability, precision, and realism. An inexperienced reviewer may recommend rejection based on the fact that one of these factors is not present, while realistically at least one will always be compromised. Empirical research is a slow, labour-intensive process in which understanding and insight can develop through time. That said, there are several important factors to consider when publishing empirical research. These include: