Due to the co-ownership, the privacy preferences of different owners about the shared data may conflict. As a result, some owners' actions may breach others' privacy. Thus, it is necessary to employ collaborative privacy management that satisfies the following requirements: content integrity, semiautomated operation, adaptivity, and group-preference support.
In order to aggregate the privacy setting decisions of the co-owners, it is necessary to balance the time complexity and fairness of the algorithm. Thus, Reference 14 proposes an incentive-based mechanism for users to share data and leverage the decisions about their privacy. It makes use of a credit-based system in which the user earns credits proportional to the amount of data the user discloses, as a co-owner, and to the number of times it grants co-ownership to potential owners. For example, the originator i gains c_i = m_i + (β × m_i) × n for sharing data s with n co-owners, where m_i and β × m_i (β ∈ [0, 1]) are the credits for the data the originator discloses and for each user accepted as a co-owner, respectively. Each user accepted as a co-owner of the data gains α × m_i (α ∈ [0, 1]). It can be inferred that the credit of a user is based on the importance of the user's preferences in making the group decision. v_i(g) represents the value of user i choosing the privacy setting g. Thus, the collective decision value is defined as F(v_1(g), ..., v_n(g)), where F(·) is a collective function designed for the optimality characteristics according to Game Theory in Reference 14, maximizing the collective value. This approach has three advantages: it is (1) simple, (2) nonmanipulable, and (3) fair.
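The credit rule above is simple enough to sketch directly. The sketch below illustrates it in Python; the concrete parameter values (m_i = 10, β = 0.2, α = 0.5) are chosen for illustration only and are not taken from Reference 14.

```python
# Illustrative sketch of the credit rule c_i = m_i + (beta * m_i) * n.
# Parameter values below are hypothetical examples, not from the paper.

def originator_credit(m_i, beta, n):
    """Credit earned by originator i for disclosing data worth m_i
    credits and granting co-ownership to n users."""
    return m_i + (beta * m_i) * n

def coowner_credit(m_i, alpha):
    """Credit earned by each user accepted as a co-owner."""
    return alpha * m_i

# Example: an item worth 10 credits shared with 3 co-owners.
print(originator_credit(10, 0.2, 3))  # 16.0
print(coowner_credit(10, 0.5))        # 5.0
```

Note how the originator's credit grows linearly with the number of co-owners, which is what incentivizes granting co-ownership rather than sharing unilaterally.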
Reference 14 adopts the additive social utility, which means that F(v_1(g), ..., v_n(g)) = Σ_{i=1}^{n} v_i(g). Thus, we select the privacy setting which maximizes the collective social value: g* = argmax_g Σ_{i=1}^{n} v_i(g). After selecting g*, each user is required to pay a tax π_i. The utility of the choice c = (g, π_1, ..., π_n) is represented by u_i(c) = v_i(g) − π_i. Reference 14 utilizes the Clarke Tax mechanism, which maximizes the social utility function by encouraging truthfulness among the individuals, regardless of other individuals' choices. Thus, the tax of a user is computed as below:
π_i = Σ_{j≠i} v_j(ĝ) − Σ_{j≠i} v_j(g*), where ĝ = argmax_{g∈G} Σ_{k≠i} v_k(g) is the setting that would be selected if user i's valuation were excluded.
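To make the selection and tax computation concrete, the following is a minimal sketch of the two steps described above: choosing g* by maximizing the additive social value, then charging each user the Clarke tax. The privacy settings and valuation numbers are invented for illustration.

```python
# Sketch of g* selection and Clarke Tax computation (illustrative only).
# values[i][g] plays the role of v_i(g) for user i and setting g.

def select_and_tax(values, settings):
    """Return the chosen setting g* and the tax pi_i for each user."""
    def social(g, exclude=None):
        # Sum of v_j(g) over all users j, optionally excluding one user.
        return sum(v[g] for j, v in enumerate(values) if j != exclude)

    # g* = argmax_g sum_i v_i(g)
    g_star = max(settings, key=lambda g: social(g))
    taxes = []
    for i in range(len(values)):
        # Setting that would win if user i's valuation were excluded.
        g_hat = max(settings, key=lambda g: social(g, exclude=i))
        # pi_i = sum_{j != i} v_j(g_hat) - sum_{j != i} v_j(g*)
        taxes.append(social(g_hat, exclude=i) - social(g_star, exclude=i))
    return g_star, taxes

settings = ["public", "friends", "private"]
values = [
    {"public": 8, "friends": 3, "private": 0},
    {"public": 0, "friends": 2, "private": 7},
    {"public": 1, "friends": 6, "private": 2},
]
g_star, taxes = select_and_tax(values, settings)
print(g_star, taxes)  # friends [1, 0, 3]
```

Users 1 and 3 pay a positive tax because their preferences change the outcome for the others; user 2 is not pivotal and pays nothing, which is exactly the property that makes truthful reporting the best strategy.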
The most important feature of the Clarke Tax mechanism is that it ensures users have an incentive to remain honest in their transactions. The evidence can be found in Reference 15.
In order to automatically select the privacy preferences for each item of data, Reference 14 makes use of inference-based techniques that use tags and similarity analysis to infer the best privacy policy based on previous preferences of users about shared data. An item of data having k tags can be defined as a vector of tags: t = {t_1, ..., t_k}. A set of shared items is denoted as T = {t_1, t_2, ..., t_n}.
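One plausible reading of this tag-based inference step is sketched below: compare the new item's tags against previously shared items and reuse the policy of the most similar one. The similarity measure here (Jaccard overlap of tag sets) and the example tags are our assumptions, not details specified by Reference 14.

```python
# Hypothetical sketch of tag-similarity policy inference.
# Jaccard overlap is an assumed similarity measure, not Reference 14's.

def jaccard(tags_a, tags_b):
    """Jaccard similarity between two tag sets."""
    a, b = set(tags_a), set(tags_b)
    return len(a & b) / len(a | b) if a | b else 0.0

def infer_policy(new_tags, history):
    """history: list of (tags, policy) pairs for previously shared items.
    Returns the policy of the most tag-similar past item."""
    best_tags, best_policy = max(history, key=lambda item: jaccard(new_tags, item[0]))
    return best_policy

history = [
    (["beach", "family", "vacation"], "friends"),
    (["office", "team"], "colleagues"),
]
print(infer_policy(["beach", "sunset", "family"], history))  # friends
```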