privacy preservation surveyed in Chapter 11 that guarantee anonymity by design,
we are still quite far from discrimination-freeness by design.
19.3 The Future of Discrimination
As the previous chapters of this book indicate, there is a growing appetite in both
the private and public sectors to predict what specific individuals will do in
the future based on what happened to them in the past. Entities with vast datasets
of personal information at their disposal try to utilize that information
by applying advanced analytical tools. The outcomes of these processes are
individualized forecasts and predictions: what a specific individual will
consume, where she will travel, and whether she will fall ill, break the law, or default
on a loan. The government already uses similar mechanisms to establish who is
most likely to lie on a tax return or become a security risk at the border. 24 Yet
the reach of predictive modeling will not stop with these examples.
Stepping beyond privacy law and the ways it might limit the concerns these
practices generate, the law will also address these practices through other existing
doctrines. In doing so, lawmakers will be required to establish the legitimate
borders of this growing practice, determining which predictive tasks are
acceptable and which go too far. One of the key doctrines that will surely be
called into play in this context is that of unfair discrimination. 25
Before proceeding, it is worth noting that a shift to automated
predictive modeling as a means of decision making and resource allocation might
prove an important step towards a discrimination-free society - at least in
terms of the salient features of discrimination as we understand them today. A
computerized decision-making process can be monitored in real time and
reviewed after the fact with ease. Therefore, discriminatory practices carried out
by officials and employees in contravention of governmental or business policies could
be curbed effectively. Furthermore, physical interaction between the decider
and the subject is usually non-existent. The sensory cues that usually
trigger discrimination - a different skin color, accent, or demeanor - are thus removed
from the process, limiting additional opportunities for discriminatory
conduct. While these arguments might all prove true, it is possible that automated
practices substitute well-understood discrimination concerns with newer ones we
have yet to fully grasp.
Connecting the broad notion of unfair discrimination to the novel practices of
automated prediction will prove an elaborate task for the next generation of
jurists. As discussed in Chapter 5 of this book, discrimination is a broad term
that has generated a breadth of legal thought and case law. It is also a charged term
that quickly triggers visceral reactions and responses. Discrimination seems
intuitively relevant to the issue at hand. It usually refers to instances in which
24 For a discussion of the deployment of several such systems, see Cate, F.H. (2008) at 447, and
the sources referenced there.
25 See, for instance, Solove, D. (2011).