Biomedical Engineering Reference
program. This means that the full range of evaluative approaches is available to the formative evaluator, including experimental designs (e.g., the RCT) as well as quasi-experimental designs.
12.3 The good news, part 1
It is not the case that the training intervention must remain invariant. Program enhancement, in light of feedback from a formative evaluation, can take place concurrently with an evaluation conducted within the framework of the RCT experimental design. Of course, desirable modification practice does not (or should not) mean a hodgepodge array of "random" interventions resulting from poor program definition; this should have been pre-empted in the design phase of the program improvement model. Nor should "random" interventions result from the inability of those who implement training programs to understand adequate definitions; that should have been addressed in the implementation phase. 14
For the experimental method, it makes no difference whether an evaluative judgment of program ineffectiveness is available for program adaptation, nor whether changes in the training intervention are implemented. Evaluators can thus fulfill their mandate to disseminate timely data and to support concomitant programmatic change.
The evaluator can realize, on the basis of an ongoing program evaluation, that "training intervention G will not produce the desired results." The intervention can then be revised in "midstream," so to speak, and the evaluators can still complete their formative evaluation.
The dissemination of evaluative findings through an appropriate study monitoring committee, and a managerial