5. Legal rules governing neurotechnology must achieve fair play and justice
in ways that avoid social disruption, invasion of privacy, or coercion.
Neurotechnology offers the potential to change the way we see ourselves
and think about society. That may prove beneficial. It may also produce
outcomes that disrupt society and raise questions about discrimination
and fair play by empowering a new elite who may have abilities unavailable
to others. This is not far-fetched.
The military is testing exoskeleton suits that will vastly increase the
capacity of soldiers to function. These include Cyberdyne's HAL-5, which
interprets faint electrical signals in the skin around damaged muscles and
moves motorized joints in response (Ponsford 2013). That technology is
likely to have civilian applications. Zac Vawter used a thought-controlled
bionic leg to climb all 103 floors of Chicago's Willis Tower. Over 220,000
people have cochlear implants. Deep brain stimulators used for Parkinson's
disease are being tested to treat a number of other neurological conditions,
as well as psychiatric disorders and states. Artificial limbs have already
provoked controversy in Olympic competition, as some have complained
that bionic limbs provide an unfair competitive edge (Naam 2013). Similar
complaints may arise from those who argue that inequitable access to neurotechnology
provides unfair performance enhancements in the workplace, perhaps to improve
memory recall or to accelerate learning (Lynch 2004).
How do we apportion the benefits of neurotechnology equitably so
that broader goals of fair play and justice are realized? What rules must
be put in place to ensure that individuals remain free from its use in
coercion? These are legal and policy questions that must be resolved to achieve
justice and to avoid creating classes of victims who suffer new forms of invidious
discrimination.
6. At what point does weaponized neurotechnology become so lethal as to
constitute a prohibited weapon under international norms?
Open-source data do not reveal the current existence of such neurotech-
nology. But conceptually, it might be developed. Now is the time to think
through what standards might define illicit levels of development, testing
and use, and what rights states have to protect their populations against the
use of that technology.
At present, no treaty specifically applies to neurotechnology. Efforts to forge
controls on cyber malware have proven fruitless, and are likely to remain so outside
of narrow criminal activities such as banning spam or child pornography. We should
not deceive ourselves into believing that forging an international accord on
weaponized neurotechnology would be any easier.
The issues are complicated and nuanced. On the one hand, the use of such
technology should conform to the Law of Armed Conflict (LOAC). On the other, how
would LOAC considerations address the challenge posed by the notion of
“unrestricted warfare” (Lang and Xiangsui 2007), in which there are no limits,
rules, boundaries, battlefields, or sovereign places? Victory in engagement and
conflict occurs through the “principle