neurotechnology lies in the pattern of basic underlying assumptions, and once these
are understood, one can easily comprehend other, more superficial levels and appro-
priately deal with them (Ihde 2009; Schein 2010; Benanti 2012a, 2012b).
We can find traces of this phenomenon in language. The English word provocative
is built from the Latin elements pro and vocatio: something that calls forth or
advances (pro) the human to draw up ideas, tasks, or reflections (vocatio). Certainly,
uses of neurotechnology—especially as related to national security and defense
agendas—are provocative. Simultaneously, the human act of response (respondeo) is
contained in the roots of the English word responsibility. Morality and ethics are forged
by responsibility. Evidently, any such use of neurotechnology—inclusive of its
operationalization in national security and defense—incurs, if not demands,
responsibility in intent, planning, and action. Therefore, neurotechnology is intrinsically related to
ethics: Ethical questions do not arise around practical use of neuroscience; instead,
they are born and live in the essence of each neurotechnological artifact (Ihde 2009;
Benanti 2012a, 2012b). To clear away opaque terminology, we must distinguish
between neuroscience and neurotechnology. Morality and ethics are built on the
recognition that neurotechnological artifacts have a nonneutral
moral constitution. The use of such artifacts is intrinsically involved in the process
that brings neurotechnology to the market. Cultural needs are infused into moral
choices, and these indirectly offer the supposed promise(s) of national security via
neuroscience. So we must ask why we are developing these tools rather than others:
why do we need a particular kind of neurotechnology, and what kinds of human
relationships will—or should—these artifacts forge? (Ihde 2009).
CONTEMPORARY ETHICAL PARADIGMS FOR
EVALUATING NEUROTECHNOLOGY
To develop perspectives in neuroethics, it cannot be ignored that some forms of
ethical evaluation for the use of various neurotechnologies are already in
place and being applied. I would like to summarize these ethical evaluations before
I offer my own perspective. In examining the ethical arguments used to evaluate
the use or misuse of neurotechnology, I found three recurrent paradigms, which I
have called: (1) fear of the uncertain; (2) pursuit of equality and happiness;
and (3) emphasis on policies
(Benanti 2012a, 2012b).
The first paradigm, fear of the uncertain, is used to regulate or mitigate use
of neurotechnology in a double sense. Some ethicists argue that we should use
only those neurotechnologies whose effects can be foreseen and controlled; in this
way, neurotechnology use will be safe and protected from unwanted effects. In another
way, some ethicists assert that the future of national defense and security is
what really remains uncertain, and thus only the concerted use of neurotechnology
can transform uncertainty into any realistic form of national safety. Both
are focused on fear: fear of what can happen in the future if neurotechnology
is either allowed or disallowed in national security scenarios. I believe that we
cannot allow fear to play such a prominent, if not preemptive, role in neuroethical
assessment and adjudication of neurotechnological applications in national
defense agendas (Benanti 2012b).