must undergo peer review. Hopefully, diligent reviews include considerations of possible ways that the
research conclusions are wrong. Any hypothesis that survives stringent and relentless testing is considered to be “confirmed.” Actually, the twentieth-century philosopher Karl Raimund Popper (1902-1994) argued that such hypotheses are merely “corroborated.”
There are two reasons that scientists and engineers are reluctant to confirm most hypotheses. First, the
experiment is tightly defined and restricted to the conditions under which it is conducted. Particularly
in engineering practice, numerous variables are not fully considered until they are tested in the real
world. That is why prototypes, as well as follow-up to experimentation, are so important. Second, induction is a weak basis for confirming anything. Inductive reasoning, or inductive logic, is a form of reasoning in which the premises of an argument support the conclusion but do not guarantee that it is correct. Thus, unlike deductive reasoning, induction leaves uncertainty no matter how much evidence supports a conclusion. In other words, we are drawing a general conclusion from our specific data and findings.
Professional judgment depends heavily on inductive reasoning, such as when diagnosing cancer in a
patient. The oncologist combines a number of factors in characterizing the patient's status. These include pathology reports (themselves inductive, since the pathologist judges whether cells are cancerous based on visual and other examinations, such as whether they have distinct nuclei), biomarkers (e.g., enzymes), and other biomedical data. These empirical results, which in themselves may appear unrelated, are integrated by the physician using inductive reasoning to come to general conclusions. This
same process is used by engineers in most design applications, where seemingly disparate information
is used to arrive at a workable design.
RESEARCH CONFLICT OF INTEREST
Bioengineering research involves many of the same conflicts of interest that practitioners face, but they come in different forms. For example, researchers may be tempted to be less than honest, or at least less diligent, in pursuing research that does not serve their own research purposes and agenda. These conflicts can be financial or ideological (see Teachable Moments: Truth and Turtles), and sometimes both.
It suffices to say that bioethical debates must be based on facts that are unassailable. Many divisive
elements separate the factions of bioethical debates, but facts should not be one of them. Honesty
should never be sacrificed for convenience. It is at times difficult for researchers immersed in science
to be objective. Even when we are looking at the same facts, we can come to very different moral
conclusions. Joe Herkert of North Carolina State University has observed that one division is between the type of rationality employed by engineers and other technical experts and that employed by social scientists and cultural experts. Some of the differences are shown in Table 5.1.
Perhaps the best advice to engineers regarding research integrity and avoiding intellectual and other
conflicts of interest comes from Richard Feynman:
The first principle is that you must not fool yourself - and you are the easiest person to fool. So you have
to be very careful about that. After you've not fooled yourself, it's easy not to fool other scientists. You just
have to be honest in a conventional way after that.