Some argue that the instructions and warnings accompanying products should be given sufficient consideration, and that if a user willfully ignores this information, doing so is an exercise of the user's autonomy in a rational decision made with “informed consent.” Others disagree, holding that liability extends beyond labeling and reasonable use. 28
Economists have found that policies designed to protect the public may inadvertently lead to “atten-
uation and even reversal of the direct policy effect on expected harm … because of offsetting behavior
(OB) by potential victims as they reduce care in response to the policy.” 29 Much of the research has
been related to transportation, especially road safety. For example, drivers with antilock brake systems
tend to tailgate more closely, and the injury prevention achieved by bicycle helmets has not been commensurate with expectations. It is logical that such behavior would also extend to medical devices.
The offsetting behavior can even be a factor in macroethics. For example, birth control devices and
drugs have clearly affected sexual and drug use behaviors. Even the public response to the acquired
immunodeficiency syndrome (AIDS) epidemic was colored by offsetting behavior. For example, many
did not question the morality of multiple partners so much as the failure of medical technology (e.g., a
drug or a vaccine or a device) to alleviate the sexual- and drug-related transmissions of the disease. To
many, it was not an ethical problem so much as a technological one. Drug users needed cleaner syringe
needles, and potential sexual partners needed quicker and more reliable human immunodeficiency virus (HIV) screening technologies. The offsetting behaviors resulting from reliance on technologies (birth control pills, intrauterine devices, etc.) were clearly a factor in the changes in social mores following the 1960s, and an indirect societal obstacle to addressing the impending epidemic.
Artifacts
Engineers are vulnerable to artifacts. That is, their designs may have unintended impacts long after they are deployed. Biomedical
risk assessment should not be limited to biomedical engineers. For example, biomedical enterprises
can be harmed by failure in any engineering discipline. One of the most notorious biomedical device
failures had little to do with biomedical engineering and much to do with computer science and software
engineering. A linear accelerator (i.e., linac) known as Therac-25 was used in the 1980s for X-ray
medical radiation therapy. In the United States and Canada, six accidents associated with this device were
reported between June 1985 and January 1987. These accidents led to substantial overdoses resulting in
deaths and serious injuries. The accidents resulted from a design artifact:
Between the patient and the Therac-25's radiation beam was a turntable that could position a window or
an X-ray-mode target between the accelerator and patient, depending on which of two modes of operation
was being used. If the window was positioned in the beam's path with the machine set to deliver radiation
through the X-ray-mode target, disaster could result because software errors allowed the machine to operate
in this configuration. 30
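To make the hazard concrete, the following is a minimal, hypothetical sketch (in Python, and emphatically not the actual Therac-25 code) of how a controller that trusts its own mode flag, rather than verifying the physical turntable position or deferring to a hardware interlock, can reach the dangerous configuration described above.

from enum import Enum

class BeamMode(Enum):
    ELECTRON = 1   # low-current beam; window/scanning hardware belongs in the beam path
    XRAY = 2       # high-current beam; requires the X-ray target in the beam path

class TurntablePosition(Enum):
    WINDOW = 1
    XRAY_TARGET = 2

class LinacController:
    """Hypothetical controller illustrating a software-only 'interlock'."""

    def __init__(self):
        self.mode = BeamMode.ELECTRON
        self.turntable = TurntablePosition.WINDOW

    def request_mode(self, mode: BeamMode) -> None:
        # The software records the new mode immediately, but the mechanical
        # turntable move finishes later; a rapid edit by the operator can
        # leave the mode flag and the physical turntable inconsistent.
        self.mode = mode

    def fire_beam_unsafe(self) -> None:
        # UNSAFE: fires at the power implied by the mode flag without checking
        # the physical turntable. X-ray power delivered with only the window
        # in the beam path is the overdose configuration described above.
        print(f"Beam on: {self.mode.name} power, turntable at {self.turntable.name}")

    def fire_beam_with_interlock(self) -> None:
        # A defensive check of the kind earlier hardware interlocks enforced
        # independently: refuse to fire if mode and turntable disagree.
        if self.mode == BeamMode.XRAY and self.turntable != TurntablePosition.XRAY_TARGET:
            raise RuntimeError("Interlock trip: X-ray target not in the beam path")
        self.fire_beam_unsafe()

The point of the sketch is not the specific software fault in the Therac-25 but the design choice it illustrates: when independent hardware interlocks are removed, a single unchecked software state variable becomes the only barrier between normal operation and a lethal overdose.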
In the Therac-25 case, the offsetting behavior was not that of the patient, but of the technical user. Users
evidently became overly reliant on the software. In particular, the computer-controlled system did away with the independent protective circuitry and mechanical interlocks that earlier Therac models had used to prevent radiation overdoses. The “improved” model relied to a much larger extent on software to provide such protections. What made matters worse was that