Reproducibility
Reproducibility describes how close the measurements are when the same input is measured repeatedly over time. When the range of measurements is small, the reproducibility is high. For example, a temperature sensor may have a reproducibility of 0.1 °C for a measurement range of 20 °C to 80 °C. Note that reproducibility can vary depending on the measurement range. In other words, readings may be highly reproducible over one range and less reproducible over a different operating range.
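
As a brief illustration (the repeated readings below are hypothetical, not taken from the text), the reproducibility at a fixed input can be characterized by the spread of repeated readings:

# Hypothetical repeated readings (in degrees C) of a temperature sensor
# held at a constant 37.0 degree C input.
readings = [36.98, 37.02, 37.01, 36.99, 37.03, 37.00]

# Reproducibility expressed as the range (max - min) of the repeated readings:
# a smaller spread corresponds to higher reproducibility.
spread = max(readings) - min(readings)
print(f"Spread of repeated readings: {spread:.2f} C")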
Offset
Offset refers to the output value when the input is zero, as illustrated in Figure 10.1.
Linearity
Linearity is a measure of the maximum deviation of any reading from a straight calibration line. The calibration line is typically defined by the least-squares regression fit of the input versus output relationship. Typically, sensor linearity is expressed as either a percent of the actual reading or a percent of the full-scale reading.
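
As a sketch of how these quantities might be computed (the calibration data below are hypothetical), a least-squares line can be fit to the input-output pairs and the maximum deviation expressed as a percent of the full-scale reading:

import numpy as np

# Hypothetical calibration data: known inputs and corresponding sensor outputs.
inputs = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])
outputs = np.array([0.05, 1.02, 2.08, 2.95, 4.01, 5.03])

# Least-squares regression fit of the input versus output relationship.
slope, intercept = np.polyfit(inputs, outputs, 1)
fitted = slope * inputs + intercept

# Maximum deviation of any reading from the straight calibration line,
# expressed here as a percent of the full-scale output span.
max_deviation = np.max(np.abs(outputs - fitted))
full_scale = outputs.max() - outputs.min()
linearity_pct_fs = 100.0 * max_deviation / full_scale
print(f"Linearity: {linearity_pct_fs:.2f}% of full scale")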
The conversion of an unknown quantity to a scaled output reading by a sensor is most convenient if the input-output calibration equation follows a linear relationship. This simplifies the measurement: any input value can be converted by multiplying by a single constant factor, rather than by consulting a "lookup table" of multiplication factors that depend on the input quantity, as would be required if the calibration equation followed a nonlinear relation. Note that although a linear response is sometimes desired, accurate measurements are possible even if the response is nonlinear, as long as the input-output relation is fully characterized.
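
A brief sketch of the two cases (the gain, offset, and table values below are hypothetical): a linear calibration needs only a single constant factor and offset, whereas a nonlinear calibration requires a lookup table with interpolation between tabulated points.

import numpy as np

def convert_linear(raw, gain=2.5, offset=0.1):
    """Linear calibration: one constant factor (plus offset) converts any raw reading."""
    return gain * raw + offset

# Hypothetical lookup table for a nonlinear sensor: raw readings and the
# corresponding calibrated values.
raw_points = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
cal_points = np.array([0.0, 0.9, 2.1, 3.6, 5.5])

def convert_nonlinear(raw):
    """Nonlinear calibration: table lookup with linear interpolation."""
    return np.interp(raw, raw_points, cal_points)

print(convert_linear(1.8))     # single multiplication and addition
print(convert_nonlinear(1.8))  # table lookup plus interpolation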
Response Time
The response time indicates the time it takes a sensor to reach a certain percent (e.g.,
95 percent) of its final steady-state value when the input is changed. For example, it may
take 20 seconds for a temperature sensor to reach 95 percent of its maximum value when a change in temperature of 1 °C is measured. Ideally, a sensor should have a short response time, indicating that it responds quickly to changes in the input quantity.
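
A minimal sketch of how the 95 percent response time might be extracted from sampled step-response data (the first-order response below is hypothetical):

import numpy as np

# Hypothetical step response: a first-order sensor with a 5 s time constant
# responding to a 1 degree C step change, sampled once per second.
t = np.arange(0.0, 31.0, 1.0)            # time in seconds
output = 1.0 * (1.0 - np.exp(-t / 5.0))  # sensor output in degrees C

final_value = output[-1]        # approximate steady-state value
threshold = 0.95 * final_value  # 95 percent of the final value

# Response time: first sample at which the output reaches the threshold.
idx = np.argmax(output >= threshold)
print(f"95% response time: {t[idx]:.0f} s")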
Drift
Drift refers to the change in sensor reading when the input remains constant. Drift can be
quantified by running multiple calibration tests over time and determining the corresponding
changes in the intercept and slope of the calibration line. Sometimes, the input-output relation
may vary over time or may depend on another independent variable that can also change
the output reading. This can lead to a zero (or offset) drift or a sensitivity drift, as illustrated in
Figure 10.2. To determine zero drift, the input is held at zero while the output reading is
recorded. For example, the output of a pressure transducer may depend not only on pressure
but also on temperature. Therefore, variations in temperature can produce changes in output
readings even if the input pressure remains zero. Sensitivity drift may be found by measuring
changes in output readings for different nonzero constant inputs. For example, for a pressure
transducer, repeating the measurements over a range of temperatures will reveal how much
the slope of the input-output calibration line varies with temperature. In practice, both zero and sensitivity drifts may occur.
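
As a sketch of this procedure (all pressure-transducer data below are hypothetical), calibration lines can be fit at two temperatures; the change in intercept indicates zero (offset) drift and the change in slope indicates sensitivity drift:

import numpy as np

# Hypothetical pressure-transducer calibrations taken at two temperatures.
pressure = np.array([0.0, 20.0, 40.0, 60.0, 80.0, 100.0])  # input (mmHg)
out_25C = np.array([0.02, 1.01, 2.03, 3.00, 4.02, 5.01])   # output at 25 C (V)
out_40C = np.array([0.10, 1.12, 2.15, 3.16, 4.20, 5.22])   # output at 40 C (V)

# Least-squares calibration line (slope, intercept) at each temperature.
slope_25, intercept_25 = np.polyfit(pressure, out_25C, 1)
slope_40, intercept_40 = np.polyfit(pressure, out_40C, 1)

# Zero (offset) drift: change in the intercept, i.e., the output at zero input.
zero_drift = intercept_40 - intercept_25

# Sensitivity drift: change in the slope of the calibration line.
sensitivity_drift = slope_40 - slope_25

print(f"Zero drift:        {zero_drift:.3f} V")
print(f"Sensitivity drift: {sensitivity_drift:.4f} V/mmHg")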