Inductor: Web-Based Dynamic Assessment Tool
In a related study, Leonard, Dufresne, and Mestre (1996) had physics students describe the principles involved in physics problems and write a justification for their answer. The instructors also discussed problem-solving strategies during their lectures, much like the invariant-based explanations and techniques for problem solving that we present through Inductor. They found that the students who were taught problem-solving strategies generated more correct answers to problems, were less dependent on surface features of problems when selecting the principles that governed problem solving, and better recalled the major principles covered in the course months later. The effort those instructors put into carefully reviewing and grading all the students' writings during the course provided valuable feedback and learning opportunities for the students, but also undoubtedly represented a significant investment of time and effort on the part of the instructors.

The Inductor tool made a trade-off by providing automated feedback in the form of hints, expert explanations, and learning resources to students. Our focus was on self-assessment and on providing Inductor as a supplementary resource to classroom instruction.

Inductor was designed as an online assessment tool in which students answer multiple-choice questions, select the invariant principle that best applies to the circuit problem, and finally write an explanation for their answer. What makes Inductor a dynamic assessment environment is that students are given opportunities to learn from outside resources while taking the test. Inductor not only provided instruction for remediating misconceptions, but also taught the invariants technique for circuit problem solving. After choosing the invariant principle involved in a problem and then selecting an answer, a student who is incorrect on either the invariant or the answer receives immediate feedback in the form of expert hints and explanations emphasizing the invariant properties of the circuit in the problem, along with links to outside resources such as circuit diagrams and tutorials. Students could consult these resources, then revise their answers or their choice of invariant principle, and finally view a video of an expert explanation of the solution to the circuit problem.
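The feedback loop just described — choose an invariant principle, select an answer, receive hints and resource links when either is wrong, revise, then view the expert explanation — can be sketched as follows. All names, questions, and hint text here are hypothetical illustrations, not taken from the actual Inductor implementation:

```python
# Hypothetical sketch of a dynamic-assessment feedback loop in the style
# of Inductor. Every identifier and string below is illustrative.
from dataclasses import dataclass, field


@dataclass
class Item:
    question: str
    correct_answer: str
    correct_invariant: str
    hints: list = field(default_factory=list)       # expert hints shown on error
    resources: list = field(default_factory=list)   # links to diagrams, tutorials


def check_response(item, answer, invariant):
    """Return (is_correct, feedback).

    Feedback is empty when both the answer and the chosen invariant
    principle are correct; otherwise it bundles error messages with the
    item's hints and outside resources so the student can revise.
    """
    errors = []
    if invariant != item.correct_invariant:
        errors.append("Reconsider which invariant principle applies here.")
    if answer != item.correct_answer:
        errors.append("The answer is incorrect; review the hints below.")
    if errors:
        return False, errors + item.hints + item.resources
    return True, []


# Usage: a student first picks the wrong invariant, revises, then succeeds.
item = Item(
    question="After the switch closes, what is the steady-state capacitor current?",
    correct_answer="zero",
    correct_invariant="In DC steady state a capacitor carries no current",
    hints=["Think about the capacitor's behavior after a long time."],
    resources=["RC-circuit tutorial (link)"],
)

ok, feedback = check_response(item, "zero", "Ohm's law")
print(ok, len(feedback))   # incorrect invariant: hints and resources attached

ok, feedback = check_response(item, "zero", item.correct_invariant)
print(ok)                  # True
```

The key design point this sketch captures is that feedback is tied to the invariant choice as well as the answer, so a student who guesses the right answer for the wrong reason still gets remediation.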
Pilot Study with Inductor
We ran a test study of the Inductor tool using our
DC and AC test questions with a small group of
first-year electrical engineering students (N=6). All students completed two 14-item multiple-choice tests using the online Inductor tool. The items on the two tests were matched for difficulty. We wanted to see whether performance improved from the first test to the second, and also to collect evidence that students' explanations of circuit behavior improved.
Overall Results from Pilot Study
Overall, the participating students scored an av-
erage 61% correct answers on the first test, and
82% correct on the second test, an improvement of 21 percentage points. Five of the six participants showed an
improvement from the first to second test. As in
earlier research our group conducted, we found that
students, at least initially, had the most difficulty
with problems dealing with capacitors and other
dynamic components. In this study, however, by
the second test students were performing well in
all categories, showing the largest improvement
with DC capacitor circuit problems.
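As a quick arithmetic check (a sketch, not part of the study's analysis), the reported averages correspond to roughly 8.5 and 11.5 correct items out of 14 — a gain of 21 percentage points, or about a 34% relative improvement over the first-test score:

```python
# Quick check of the pilot-study averages (percentages from the text;
# the per-item counts are inferred, not reported in the study).
items_per_test = 14
first, second = 0.61, 0.82     # average proportion correct, tests 1 and 2

point_gain = second - first               # absolute gain in percentage points
relative_gain = point_gain / first        # gain relative to the test-1 score

print(f"items correct: {first * items_per_test:.1f} -> "
      f"{second * items_per_test:.1f}")
print(f"gain: {point_gain * 100:.0f} percentage points "
      f"({relative_gain:.0%} relative)")
```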
Student Explanations
Students initially revealed misconceptions (“high-
er resistance means more power is absorbed,”
“internal resistance rises,” “the internal resistance
is lower for low frequencies,” “John's battery will
be required to work harder to push current through
the larger resistor”), errors (“after long time, all