they are also more likely to trust them. In general,
the survey respondents indicated that they found
privacy statements to be too long, too verbose,
too legalistic, and all the same. The study also
reports that consumers tend not to read privacy
policies when they have prior experience with
a company. These findings suggest that privacy
policies are an important trust-building vehicle in
initial interactions between companies and users.
Therefore, improving their quality is a worthwhile
effort that may result in stable relationships.
Additional Remedies against Privacy Concerns

To enhance user trust in privacy practices, self-regulatory privacy certification schemes have emerged. These certification programs audit privacy practices in exchange for a fee and allow Web sites to display a privacy seal (trustmark) if their practices adhere to the program's guidelines (Sinrod, 2001). The two most prominent examples are seal pioneer TRUSTe, founded by the Electronic Frontier Foundation and CommerceNet (Cranor, 1998), and the Better Business Bureau's privacy seal BBBOnline (BBBOnline, 2004). There is, however, an inherent conflict of interest in paid third-party certification: the certifying institutions are interested in adding seals but may be reluctant to remove them from sites that do not comply. Still, the display of such a seal of approval may add trustworthiness and credibility to corporate Web sites, suggesting that the company is willing to have its privacy practices audited (Benassi, 1999). Miyazaki and Krishnamurthy (2002) examined the effectiveness of privacy seals and found that even the mere display of a trustmark increases user trust. Similarly, Nöteberg et al. (2003) conclude that privacy seals have a significant effect when a company is not well known but are unnecessary for high-reputation companies.

Another attempt to make data collection and sharing more transparent is the Platform for Privacy Preferences (P3P), launched in 1997 by the World Wide Web Consortium to give users more control over their own data (World Wide Web Consortium, 2003). P3P spares users the need to read online privacy policies themselves: when a user enters a site, a P3P-enabled Web browser compares the site's privacy policy, published in the P3P format, against the user's preset privacy preferences and determines whether the policy reflects the user's personal wishes for privacy (Presler-Marshall, 2000). If implemented correctly, P3P theoretically renders it unnecessary for users to read privacy policies, as the browser notifies users of any inconsistencies (Reagle & Cranor, 1999). As of 2003, however, only 10% of Web sites had P3P-based privacy policies (Cranor et al., 2003). Reasons for this slow adoption include unresolved legal issues and the difficulty companies face in squeezing their complicated privacy policies into the more straightforward P3P scheme (Thibodeau, 2002). Nevertheless, companies using P3P may project the image of an organization willing to invest time and resources in increasing the transparency of its data-handling practices (Turner & Dasgupta, 2003). But, as with conventional privacy policies, users with P3P-enabled Web browsers still cannot be sure whether companies actually adhere to what they state in their P3P privacy policies (Delaney et al., 2003). An additional hurdle is that users are still reluctant to fill out a form and set their privacy preferences in order to activate P3P (Ghosh, 2001). Thus, for the time being, privacy policies still have to be read by humans rather than Web browsers, and their quality therefore remains critical to users' trust in Web sites.
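The matching step a P3P-enabled browser performs can be illustrated with a small sketch. The real platform encodes site policies in XML and user preferences in a separate rule language (APPEL); the data-category and purpose names, the dictionaries, and the `check_policy` function below are simplified stand-ins for illustration only, assuming a policy is just a mapping from data categories to the purposes for which they are used:

```python
# Illustrative sketch (not the real P3P/APPEL machinery): a browser-side
# check that compares a site's machine-readable policy against the
# user's preset preferences and flags any mismatch.

# A site's policy: for each data category, the purposes it is used for.
site_policy = {
    "contact_info": {"order_fulfillment", "marketing"},
    "clickstream": {"site_analytics"},
}

# The user's preferences: the purposes they accept per data category.
user_preferences = {
    "contact_info": {"order_fulfillment"},  # no marketing use accepted
    "clickstream": {"site_analytics"},
}

def check_policy(policy, preferences):
    """Return a list of (category, disallowed_purposes) conflicts."""
    conflicts = []
    for category, purposes in policy.items():
        allowed = preferences.get(category, set())  # unknown category: nothing allowed
        disallowed = purposes - allowed
        if disallowed:
            conflicts.append((category, disallowed))
    return conflicts

# The browser would warn the user instead of silently loading the site.
for category, purposes in check_policy(site_policy, user_preferences):
    print(f"Warning: {category} used for {sorted(purposes)} beyond your preferences")
```

Here the browser would flag that contact information is also used for marketing, which the user has not accepted. This automation is exactly what the low adoption figures above undercut: with so few sites publishing machine-readable policies, the comparison usually has nothing to run against.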
Theoretical Foundation
Originally put forward by Berger and Calabrese
(1975), uncertainty reduction theory (URT) pro-
vides the framework for this study. Based on the