Image moments are defined as

m_{pq} = \sum_{x=0}^{M-1} \sum_{y=0}^{N-1} x^p y^q \, g(x, y) \qquad (6.37)
where x and y define the pixel location and N and M the image size. We utilise moments m_{01} and m_{10}, which essentially describe the centre of gravity of the breast regions.
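As an illustration, a minimal NumPy sketch of Eq. (6.37) and of the resulting centre of gravity might look as follows; the function names and the row/column convention for x and y are our own assumptions, not the authors' implementation.

```python
import numpy as np

def raw_moment(g, p, q):
    """Raw image moment m_pq of Eq. (6.37) for a 2-D region g."""
    M, N = g.shape                    # M x N image; x indexes rows here (assumed convention)
    x = np.arange(M).reshape(-1, 1)   # x = 0 .. M-1
    y = np.arange(N).reshape(1, -1)   # y = 0 .. N-1
    return float(np.sum((x ** p) * (y ** q) * g))

def centre_of_gravity(g):
    """Centre of gravity (m10/m00, m01/m00) of a breast region."""
    m00 = raw_moment(g, 0, 0)
    return raw_moment(g, 1, 0) / m00, raw_moment(g, 0, 1) / m00
```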
Histogram features
Histograms record the frequencies of certain temperature ranges of the thermograms. In our work we construct normalised histograms of both regions of interest. As features we use the cross-correlation between the two histograms. From the difference histogram (i.e. the difference between the two histograms) we compute the absolute value of its maximum, the number of bins exceeding a certain threshold (0.01 in our experiments), the number of zero crossings, the energy, and the difference of the positive and negative parts of the histogram.
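The sketch below shows one way these features could be computed, assuming both histograms are already normalised and share the same binning; the Pearson-style cross-correlation and the reading of "difference of the positive and negative parts" are our interpretations, not the authors' definitions.

```python
import numpy as np

def histogram_features(h1, h2, threshold=0.01):
    """Features from two normalised temperature histograms (same binning)."""
    d = h1 - h2                                    # difference histogram
    cross_corr = float(np.corrcoef(h1, h2)[0, 1])  # cross-correlation (one possible definition)
    abs_max = float(np.max(np.abs(d)))             # absolute value of its maximum
    n_above = int(np.sum(np.abs(d) > threshold))   # bins exceeding the threshold
    zero_crossings = int(np.sum(np.diff(np.sign(d)) != 0))
    energy = float(np.sum(d ** 2))                 # energy of the difference histogram
    # difference of positive and negative parts (our reading; depends on normalisation)
    pos_neg = float(d[d > 0].sum() - (-d[d < 0]).sum())
    return cross_corr, abs_max, n_above, zero_crossings, energy, pos_neg
```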
Cross co-occurrence matrix
Co-occurrence matrices have been widely used in texture recognition tasks (Haralick, 1979) and can be defined as

\gamma^{(k)}_{T_i, T_j}(I) = \Pr_{p_1 \in I_{T_i},\, p_2 \in I} \left[\, p_2 \in I_{T_j},\ |p_1 - p_2| = k \,\right] \qquad (6.38)
with

|p_1 - p_2| = \max\left( |x_1 - x_2|,\ |y_1 - y_2| \right) \qquad (6.39)
where T_i and T_j denote two temperature values and (x_k, y_k) denote pixel locations. In other words, given any temperature T_i in the thermogram, \gamma gives the probability that a pixel at distance k away is of temperature T_j. In order to arrive at an indication of asymmetry between the two sides, we adopted this concept and derived what we call a cross co-occurrence matrix, defined as
\gamma^{(k)}_{T_i, T_j}(I(1), I(2)) = \Pr_{p_1 \in I(1)_{T_i},\, p_2 \in I(2)} \left[\, p_2 \in I(2)_{T_j},\ |p_1 - p_2| = k \,\right] \qquad (6.40)
i.e. temperature values from one breast are related to temperatures of the second side. From this matrix we can extract several features (Haralick, 1979); the ones we use are
Homogeneity

G = \sum_k \sum_l \frac{\gamma_{k,l}}{1 + |k - l|} \qquad (6.41)
Energy

E = \sum_k \sum_l \gamma_{k,l}^2 \qquad (6.42)
Contrast

C = \sum_k \sum_l |k - l| \, \gamma_{k,l} \qquad (6.43)
and

Symmetry

S = 1 - \sum_k \sum_l \left| \gamma_{k,l} - \gamma_{l,k} \right| \qquad (6.44)
We further calculate the first four moments m_1 to m_4 of the matrix

m_p = \sum_k \sum_l (k - l)^p \, \gamma_{k,l} \qquad (6.45)
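To make Eqs. (6.40)–(6.45) concrete, a direct (unoptimised) sketch is given below. It assumes both regions have equal size and temperatures already quantised to integer levels; the function names and the normalisation to relative frequencies are our choices.

```python
import numpy as np

def cross_cooccurrence(img1, img2, k, n_levels):
    """Cross co-occurrence matrix of Eq. (6.40).

    img1, img2: equally sized 2-D arrays of temperatures quantised to
    integer levels 0 .. n_levels-1. Counts pixel pairs (p1 in img1,
    p2 in img2) at Chebyshev distance k (Eq. 6.39) and normalises the
    counts to relative frequencies.
    """
    H, W = img1.shape
    gamma = np.zeros((n_levels, n_levels))
    # all displacements (dx, dy) with max(|dx|, |dy|) == k
    offsets = [(dx, dy) for dx in range(-k, k + 1) for dy in range(-k, k + 1)
               if max(abs(dx), abs(dy)) == k]
    for dx, dy in offsets:
        # overlapping index ranges so that p1 and p2 = p1 + (dx, dy) are both in bounds
        xs, xe = max(0, -dx), min(H, H - dx)
        ys, ye = max(0, -dy), min(W, W - dy)
        t1 = img1[xs:xe, ys:ye].ravel()
        t2 = img2[xs + dx:xe + dx, ys + dy:ye + dy].ravel()
        np.add.at(gamma, (t1, t2), 1)   # accumulate pair counts
    return gamma / gamma.sum()

def cooccurrence_features(gamma):
    """Homogeneity, energy, contrast, symmetry and moments, Eqs. (6.41)-(6.45)."""
    k, l = np.indices(gamma.shape)
    G = float(np.sum(gamma / (1.0 + np.abs(k - l))))   # homogeneity (6.41)
    E = float(np.sum(gamma ** 2))                      # energy (6.42)
    C = float(np.sum(np.abs(k - l) * gamma))           # contrast (6.43)
    S = float(1.0 - np.sum(np.abs(gamma - gamma.T)))   # symmetry (6.44)
    m = [float(np.sum((k - l) ** p * gamma)) for p in (1, 2, 3, 4)]  # moments (6.45)
    return G, E, C, S, m
```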
Mutual information
The mutual information MI between two distributions can be calculated from their individual entropies H_L and H_R and the joint entropy H of the distributions, and is defined as

MI = H_L + H_R - H \qquad (6.46)
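A sketch of this computation, assuming the two distributions are estimated from corresponding temperature samples of the left and right regions via a joint histogram; the bin count and function names are our own choices.

```python
import numpy as np

def mutual_information(t_left, t_right, bins=64):
    """MI of Eq. (6.46): MI = H_L + H_R - H, with H the joint entropy."""
    joint, _, _ = np.histogram2d(t_left, t_right, bins=bins)
    p_lr = joint / joint.sum()      # joint distribution of the two sides
    p_l = p_lr.sum(axis=1)          # marginal of the left side
    p_r = p_lr.sum(axis=0)          # marginal of the right side

    def entropy(p):
        p = p[p > 0]                # avoid log(0)
        return -np.sum(p * np.log2(p))

    return float(entropy(p_l) + entropy(p_r) - entropy(p_lr.ravel()))
```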