Biomedical Engineering Reference
In-Depth Information
current interest in using functional neuroimaging (e.g., fMRI) for detecting deception
(NAS 2008). The validity of this approach depends on the accuracy with which
such technologies can detect psychological states relevant to deception (such as anxi-
ety vs. something more abstract, e.g., cognitive dissonance). How science portrays
the relationship between patterns that may be detected by neuroimaging and what
those patterns actually represent depends upon the (neuroscientific) interpretation of the
validity of the technology (i.e., whether it actually does what it is intended to do) and,
in light of this, on whether the resulting data and information constitute viable knowledge (Uttal 2001;
Farah 2005; McFate 2005; Illes 2006; Giordano 2012).
An illustration of how (mis)conceptions of causal relationships of the brain and
cognition can constitute a rationale for employing technologies or tactics is reflected
by the following quote, from the May 4, 2009, issue of Newsweek. In this context,
the speaker is asking the contractor who will replace him about a given approach to
interrogation.
. . . I asked [the contractor] if he'd ever interrogated anyone, and he said 'no, but that didn't
matter', the contractor shot back, 'Science is science. This is a behavioral issue.' He told
me he's '. . . a psychologist and . . . knows how the human mind works' (Isikoff 2009).
This is relevant to the use of neurotechnology as it reflects a social tendency to con-
cretize contingent neuroscientific understanding as “truth.” To be sure, neuroscience
is an iterative enterprise. Thus, applications and use(s) of neurotechnology remain
works-in-progress. Still, neurotechnology can be used to create weapons that may
have an unprecedented capacity to alter the cognitions, emotions, beliefs, and behav-
iors of individuals, and groups—if not societies. Thus, the potential “power” of neu-
rotechnology as weaponry lies in the ability to assess, access, and change aspects of
a definable “self.” As with any weapon, these technologies pose threats to autonomy and
free will, and can do so to an extent that psychological weapons alone could not.
It is foolhardy to think that the technological trend that compels the use of neu-
rotechnology as weapons will be impeded merely by considerations of (1) the bur-
dens and risks that might arise as science advances ever deeper into the frontiers of
the unknown; (2) the potential harms that such advances could intentionally and/or
unintentionally incur; and (3) the ethico-legal and social issues instantiated by both
the positive and negative effects and implications of these advances. This is because
a strong driving (or “pushing”) force of both science and technology is the human
desire(s) for knowledge and control. At the same time, environmental events, market
values, and sociopolitical agendas create a “pulling force” for technological progress
and can dictate direction(s) for its use. Both the former and the latter are important to
national security and defense. In the first case, the use of contingent knowledge could
evoke unforeseen consequences that impact public safety, and the power conferred by
scientific and technological capability could be used to leverage great effect(s). In the
second case, the intentional use of these technologies by individual agents or groups in
ways that are hostile could incur profound public threats.
Thus, a simple precautionary principle in which risk-benefit ratios determine the
trajectory and pace of technological advancement is not tenable on an international
level, as there is the real possibility—if not probability—that insurgent nations and/or