Similarly, the weights of all other combinations are as follows:
W(X) :=
    1.5    if X(Sex) = f and X(Class) = +
    0.67   if X(Sex) = f and X(Class) = −
    0.75   if X(Sex) = m and X(Class) = +
    2      if X(Sex) = m and X(Class) = −
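If, as in the reweighing approach sketched above, the weight of a combination is taken to be its expected frequency divided by its observed frequency, W(s, c) = P(s) · P(c) / P(s, c), then these values can be reproduced with the following minimal Python sketch. The toy data, attribute names, and function name are only illustrative: the data merely matches the marginals implied by the weights and is not Table 12.1 itself.

from collections import Counter

def reweighing_weights(records, sens="Sex", label="Class"):
    # W(s, c) = P(s) * P(c) / P(s, c): expected over observed frequency.
    # `records` is a list of dicts; the attribute names are illustrative.
    n = len(records)
    count_s = Counter(r[sens] for r in records)
    count_c = Counter(r[label] for r in records)
    count_sc = Counter((r[sens], r[label]) for r in records)
    return {(s, c): (count_s[s] * count_c[c]) / (n * count_sc[(s, c)])
            for (s, c) in count_sc}

# Hypothetical toy data reproducing the marginals implied by the weights
# above (it is NOT Table 12.1):
data = ([{"Sex": "f", "Class": "+"}] * 2 + [{"Sex": "f", "Class": "-"}] * 3 +
        [{"Sex": "m", "Class": "+"}] * 4 + [{"Sex": "m", "Class": "-"}] * 1)
print(reweighing_weights(data))
# {('f', '+'): 1.5, ('f', '-'): 0.666..., ('m', '+'): 0.75, ('m', '-'): 2.0}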
12.3.1.3
Related Approaches
The authors of (Luong et al., 2011) propose a variant of k-NN classification for the
discovery of discriminated objects. They consider a data object as discriminated if
there is a significant difference in treatment between its neighbors belonging to the deprived community and its neighbors not belonging to it (that is, the favored
community). They also propose a discrimination prevention method by changing the
class labels of these discriminated objects. This discrimination prevention method
is very close to our massaging technique (Kamiran & Calders, 2009a), especially
when the ranker being used is based upon a nearest neighbor classifier. There is,
however, one big difference: whereas in massaging only the minimal number of
objects is changed to remove all discrimination from the dataset, the authors of
(Luong et al., 2011) propose to continue relabeling until all labels are consistent.
From a legal point of view, the cleaned dataset obtained by (Luong et al., 2011) is
probably more desirable as it contains fewer “illegal inconsistencies.” For the task of discrimination-aware classification, however, it is unclear whether the obtained dataset is
suitable for learning a discrimination-free classifier.
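To make this neighborhood comparison concrete, the following is a simplified Python sketch, not the exact measure of (Luong et al., 2011): a deprived object with a negative decision is flagged when its k nearest favored neighbors receive positive decisions noticeably more often than its k nearest deprived neighbors. The Euclidean distance, the choices of k and threshold, and the function name are assumptions for illustration only; the prevention step would then relabel the flagged objects, analogously to massaging.

import numpy as np

def flag_discriminated(X, y, deprived, k=3, threshold=0.3):
    # X: (n, d) numeric features (without the sensitive attribute),
    # y: (n,) binary decisions (1 = positive), deprived: (n,) boolean mask.
    flags = np.zeros(len(y), dtype=bool)
    # All pairwise Euclidean distances (fine for small n).
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    for i in np.where(deprived & (y == 0))[0]:
        order = np.argsort(dist[i])
        same = [j for j in order if j != i and deprived[j]][:k]   # deprived neighbors
        other = [j for j in order if not deprived[j]][:k]         # favored neighbors
        if len(same) == k and len(other) == k:
            if y[other].mean() - y[same].mean() >= threshold:
                flags[i] = True
    return flags

# A prevention step in the spirit of the paragraph above would then relabel:
# y[flag_discriminated(X, y, deprived)] = 1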
The authors of (Hajian, Domingo-Ferrer, & Martínez-Ballesté, 2011; Hajian, Domingo-Ferrer, & Martínez-Ballesté, 2011) also propose methods similar to massaging to preprocess the training data in such a way that only potentially non-discriminatory rules can be extracted. For this purpose they modify all the items
in a given dataset that lead to the discriminatory classification rules by applying
rule hiding techniques on either given or discovered discriminatory rules. For an
extensive description of this technique, see Chapter 13 of this book.
12.3.2
Changing the Learning Algorithms
In this section, we discuss discrimination-aware techniques in which the classification model learning process itself is modified to produce discrimination-free classifiers. In particular, we discuss discrimination-aware decision tree construction, in which the tree induction procedure is adapted so that the resulting trees are discrimination-free.
12.3.2.1
Discrimination-Aware Decision Tree Induction
Traditionally, when constructing a decision tree (Quinlan, 1993), we iteratively refine a tree by splitting its leaves until a desired objective is achieved. Consider the
dataset given in Table 12.1. Suppose we want to learn a tree over this dataset in