Data privacy for Big Data is serious business and is causing regulators around the globe to set up a variety of policies and procedures. Recently, the U.S. Federal Trade Commission settled a case with Facebook that now requires the company to undergo regular audits. Facebook, Inc., agreed to submit to government audits of its privacy practices every other year for the next two decades. The company also agreed to obtain explicit approval from users before changing the type of content it makes public.26 Similar processes have been put in
place at MySpace and Google. In many cases, consumers trade their privacy for
favors. For example, my cable/satellite provider sought to have my channel click
information shared with a search engine provider. They offered me a discount of
$10 if I would “opt-in” and let them monetize my channel surfing behavior.
This leads us to several interesting possibilities. Let us say that a data scientist
uses the channel surfing information to characterize a household as interested in sports cars (for example, through the number of hours logged watching NASCAR).
The search engine then places a number of sports car advertisements in the web browser used by the desktop in that household and drops a web cookie on the desktop to record this segmentation. Next, a couple of car dealers pick
up this “semi-public” web cookie from the web browser and manage to link this
information to a home phone number. It would be catastrophic if these dealers
were to start calling the home phone to offer car promotions. When I originally
opted in, what did I agree to opt in to, and is my cable/satellite provider protecting
me from the misuse of that data? As we move from free search engines to free
email to discounted phones to discounted installation services, all based on
monetization of data and advertising revenue, there is money for everyone, if
the data is properly protected against unauthorized use.
The first part of the solution is a data obfuscation process. Most of the time,
marketers are interested in customer characteristics that can be provided without
Personally Identifiable Information (PII)—that is, uniquely identifiable information about an individual that can be used to identify, locate, and contact that individual. We can destroy all PII, and the remaining data may still provide useful information to a marketer about a group of individuals. Now, under “opt-in,” the
PII can be released to a selected few, as long as it is protected from the rest. In
the preceding example, by accepting the $10 discount, I may give a web search engine permission to increase sports car advertisements to everyone in my Zip+4 while at the same time expecting protection from dealer calls, which require household-level
granularity. We can provide this level of obfuscation by destroying PII for house
number and street name while leaving Zip+4 information in the monetized data.
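To make the obfuscation step concrete, the following is a minimal Python sketch of the idea, assuming a flat customer record. The field names, the obfuscate function, and the salted household token are hypothetical illustrations rather than part of the example above; the sketch simply destroys the household-level PII (name, phone, house number, street) while leaving the Zip+4 and viewing behavior available for monetization.

import hashlib

# Hypothetical customer record; field names are illustrative only.
record = {
    "name": "Jane Doe",
    "home_phone": "555-0147",
    "house_number": "123",
    "street": "Main St",
    "zip_plus4": "10001-2345",
    "nascar_hours_per_week": 14.5,  # behavioral attribute used for segmentation
}

# Fields treated as household-level PII in this sketch: anything that can
# identify, locate, or contact the individual household.
PII_FIELDS = {"name", "home_phone", "house_number", "street"}

def obfuscate(rec, salt):
    """Destroy household-level PII, keep Zip+4 and behavioral attributes.

    A salted one-way hash of the phone number is retained as a pseudonymous
    token (an assumption of this sketch), so an authorized party holding the
    salt could re-link records under an explicit opt-in; everyone else sees
    only Zip+4-level data.
    """
    token = hashlib.sha256((salt + rec["home_phone"]).encode()).hexdigest()[:16]
    cleaned = {k: v for k, v in rec.items() if k not in PII_FIELDS}
    cleaned["household_token"] = token
    return cleaned

print(obfuscate(record, salt="per-dataset-secret"))
# -> {'zip_plus4': '10001-2345', 'nascar_hours_per_week': 14.5,
#     'household_token': '...'}

The marketer still receives segment-level signals keyed to the Zip+4, while the fields needed to call or visit a specific household never leave the provider.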