To protect location privacy from inference attacks, an effective approach is based on the notion of anonymity: users in physical proximity can coordinate their pseudonym changes to happen simultaneously [1], so that the adversary cannot link their pseudonyms before the changes to their respective pseudonyms after the changes. Existing studies [2, 3, 4] assume that all users participating in pseudonym change have the same anonymity set. However, based on an individual user's belief about the adversary's power against its location privacy (e.g., the adversary's side information about that user), the set of users that it believes can obfuscate its pseudonym (i.e., its anonymity set) can differ from that of another user. Thus motivated, we consider a general anonymity model that can meet users' needs for personalized location privacy, depending on users' physical locations. In particular, each user specifies an anonymity range (a physical area) such that the set of users within the anonymity range constitutes that user's potential anonymity set. For example, a user's anonymity range can be a disk centered at the user's location, with a large radius indicating a low level of privacy sensitivity (as illustrated in Fig. 5.1). Note that for two users at different locations, their anonymity ranges are different even when they have the same shape (e.g., two disks with the same radius but different centers), and thus their potential anonymity sets can be different.
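To make the disk-based anonymity range concrete, the following Python sketch computes each user's potential anonymity set from hypothetical positions and per-user radii; the user names, coordinates, and radii are illustrative assumptions, not part of the original model.

```python
import math

# Hypothetical users: a position (x, y) and an anonymity-range radius r
# (a larger r encodes lower privacy sensitivity). Values are illustrative.
users = {
    "A": {"pos": (0.0, 0.0), "r": 5.0},
    "B": {"pos": (3.0, 0.0), "r": 1.0},
    "C": {"pos": (10.0, 0.0), "r": 8.0},
}

def potential_anonymity_set(i, users):
    """Users other than i that lie inside i's disk-shaped anonymity range."""
    (xi, yi), ri = users[i]["pos"], users[i]["r"]
    return {j for j, u in users.items()
            if j != i and math.dist((xi, yi), u["pos"]) <= ri}

for i in users:
    print(i, potential_anonymity_set(i, users))
# A {'B'}   -- C is 10 away, outside A's radius 5
# B set()   -- A is 3 away, outside B's radius 1
# C {'B'}   -- B is 7 away, inside C's radius 8
```

Note how user B belongs to A's potential anonymity set while A does not belong to B's: users with the same range shape but different centers or radii generally obtain different sets.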
Formally, consider a set of users $\mathcal{N} = \{1, \cdots, N\}$, where each user $i$ makes a decision $a_i$ on whether or not to participate in pseudonym change, denoted by $a_i = 1$ and $a_i = 0$, respectively. Based on users' physical locations, the privacy gain perceived by a user participating in pseudonym change depends on which users also participate. Each user $i$ incurs a cost of $c_i > 0$ to participate in pseudonym change. This cost is due to a number of factors; e.g., the participating users should stop using the LBS for a period of time. Based on the general anonymity model, the physical coupling among users can be captured by a physical graph $(\mathcal{N}, \mathcal{E}^P)$, where user $j$ is connected by a directed edge $e_{ji} \in \mathcal{E}^P$ to user $i$ if user $j$ is in user $i$'s potential anonymity set $\mathcal{N}_i^P$ (i.e., $j \in \mathcal{N}_i^P$). Note that the physical coupling between two users can be asymmetric. The privacy gain perceived by a participating user $i$ is defined as its anonymity set size, i.e., the number of participating users in $\mathcal{N}_i^P$.
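Under the same illustrative setup, a minimal sketch of the physical graph $(\mathcal{N}, \mathcal{E}^P)$ follows; it reuses `users` and `potential_anonymity_set()` from the sketch above and adds a directed edge $e_{ji}$ whenever $j$ lies in $\mathcal{N}_i^P$.

```python
# A sketch of the physical graph (N, E^P), reusing users and
# potential_anonymity_set() from the previous sketch: a directed edge
# e_ji goes from j to i whenever j is in i's potential anonymity set.
def physical_graph(users):
    edges = set()
    for i in users:
        for j in potential_anonymity_set(i, users):
            edges.add((j, i))  # e_ji in E^P
    return edges

print(sorted(physical_graph(users)))
# [('B', 'A'), ('B', 'C')] -- B can help obfuscate A and C, but neither
# edge is reciprocated: the physical coupling is asymmetric.
```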
Note that the anonymity set size is a widely adopted privacy metric¹ for anonymity-based approaches. For example, $k$-anonymity is used as the privacy metric in [4, 5], where a user achieves location privacy if its pseudonym cannot be distinguished among $k$ users. Then the individual utility of user $i$, denoted by $u_i$, is given by

$$u_i(a_i, a_{-i}) = a_i \Big( \sum_{j \in \mathcal{N}_i^P} a_j - c_i \Big) \qquad (5.1)$$
¹ Another privacy metric is the entropy of the adversary's uncertainty about a user's pseudonym. However, it is usually difficult to compute, since it requires a probability distribution that is difficult to obtain.
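As a sanity check on Eq. (5.1), here is a short sketch of the utility computation, reusing the definitions above. It assumes the sum runs over $\mathcal{N}_i^P$ with user $i$ itself excluded (including $i$ would merely shift every participant's gain by one), and the action profile and costs are made up for illustration.

```python
# Utility of Eq. (5.1), reusing the sketches above. Assumption: the sum
# runs over N_i^P with i itself excluded; including i would merely add 1
# to the gain of every participant.
def utility(i, actions, users, cost):
    """u_i(a_i, a_-i) = a_i * (sum_{j in N_i^P} a_j - c_i)."""
    if actions[i] == 0:      # non-participants gain and pay nothing
        return 0.0
    gain = sum(actions[j] for j in potential_anonymity_set(i, users))
    return gain - cost[i]

actions = {"A": 1, "B": 1, "C": 0}     # hypothetical action profile
cost = {"A": 0.5, "B": 0.5, "C": 0.5}  # hypothetical costs c_i
print({i: utility(i, actions, users, cost) for i in users})
# {'A': 0.5, 'B': -0.5, 'C': 0.0} -- B pays c_B but gains nothing, since
# no participating user lies in B's (empty) potential anonymity set.
```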