doing so, they must figure out ways in which "search"-related interests could be
addressed within the governmental data mining analysis process.
As explained above, every such theory calls for a different set of balances and
findings. However, the implications of these theories - if they are indeed accepted
in the data mining contexts - run deeper. Every one of the theories mentioned
above might point regulators in a different regulatory direction when considering
ancillary privacy rights to overcome the concerns at hand. The first theory points
to the sense of intrusion that data mining generates. If this is the privacy-based
theory generating concern in the data mining context, then the concern could be
partially mitigated by a greater degree of transparency in the data mining process.
With additional knowledge of the process, public aversion might be limited.
Therefore, accepting this theory should promote this ancillary right.
The second theory points in a different direction. Addressing this concern
calls for various measures assuring that data mining analyses are used only
for the specific tasks for which they are most needed. In other words, steps must
be taken to assure that the use of these methods does not "creep" into other realms.
This could be achieved by both technological measures (which safeguard the use
of these tools) and a regulatory structure which closely supervises these uses.
Again, the theoretical perspective can point policymakers (if convinced by this
argument in the relevant context) in the direction of relevant (yet different)
ancillary rights.
The third theory might call for yet a different regulatory trajectory. The
concern it addresses relates to the very nature of the data mining practice -
one that finds its broad scope problematic. Therefore, this theory
might indeed lead to limiting data mining analysis. Another possible option might
call for engaging in the mining of anonymized data - a practice which might
somewhat mitigate these concerns (yet raises others) (Zarsky, 2012). Arguably, if
the search is of anonymized data, the interests of the many subjected to it are not
compromised by the vast net the data mining practice casts. Therefore, the use of
this measure might be found proportionate.
As there is probably a kernel of truth in every one of these theories, it would be
wise to take all of these proposals under consideration. However, at certain points
they might prove contradictory. Therefore, additional analytical work must prioritize
among them, while relying on social norms and the balancing of other rights. This
analysis of course must be context-specific, as in different contexts the relative
force of every theory will vary.
In conclusion, existing risks call for analyzing and using personal information
in an effort to preempt possible harms and attacks. Society will be forced to decide
among several non-ideal options. At the end of the day, the solution selected will
no doubt be a compromise, taking into account some of the elements set forth
here. The theoretical analysis introduced here strives to assist in the process of
establishing such a compromise, while acknowledging that there is still a great
deal of work to be done.