Paper-based Sharing of Annotations We assessed the percentage of annotations that were tagged with visibilities. Only a small percentage of annotations was tagged: on average, 2.4 % of all annotations were classified as private and 1.6 % as public. Despite these low scores, we do not conclude that privacy mechanisms are unnecessary. All interviews made it obvious that users require functionality for defining the visibility of annotations. Rather, the low scores reflect that the default setting of group visibility is appropriate for most annotations made during a lecture.
In the interviews, responses to the classification functionality varied widely. Nearly all participants agreed that it is an important feature and that tapping a button is quick, easy, and does not disrupt the main task of annotating, but they disagreed about whether the system feedback is sufficient. Many users reported feeling unsure whether a printed button had been correctly activated when tapping on it with the pen. While the pen confirms a tap on a button by briefly lighting an LED, it provides no feedback on the currently activated classification mode. This was not possible with the Anoto pens available at the time of the evaluation. Novel pens with integrated feedback capabilities could make classification with paper buttons more reliable.
5.4.2 Study II: Laboratory Study of Annotation Review
A second exploratory study assessed the use of CoScribe during review after class.
In this setting, time is less scarce than during a lecture and learners can make use of
the system's entire functionality. We evaluated the use of the CoScribe viewer for
collaborative activities and the combined use of paper and digital documents.
Method
We recruited nine students (7 male, 2 female) from among the participants of the first study. Each took part in a single-user session lasting about one hour. Participation was voluntary and no compensation was given.
Each participant was given an Anoto pen, a twenty-page printout of slides from an introductory computer science lecture, and several Digital Paper Bookmarks (see Section 7.1). He or she was seated at a table with enough free space for the paper documents. A computer screen on the table, along with a keyboard and a mouse, provided access to the CoScribe viewer.
The sessions were structured as follows: First, the participant received five minutes of training on how to use the CoScribe viewer and Digital Paper Bookmarks. We then asked the participant to perform given tasks with paper and the CoScribe viewer. These comprised creating annotations and bookmarks on paper as well as using the software viewer to modify the participant's own annotations. Next, we evaluated the appropriateness and compared the multi-user and the single-user visualiza-