of the marginal distribution f_ij(x_j), i.e.,

\int_{\phi_a}^{w_{Mij}} f_{ij}(x_j)\, dx_j = \int_{w_{Mij}}^{\phi_b} f_{ij}(x_j)\, dx_j     (11.56)

with [φ_a, φ_b] the domain of f_ij(x_j).
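A small numerical sketch of the median condition in Equation 11.56, assuming NumPy/SciPy and a Gaussian marginal (the parameter values and names below are illustrative, not taken from the text): the marginal median w_Mij splits the probability mass of f_ij(x_j) into two equal halves.

```python
import numpy as np
from scipy import stats, integrate

# Illustrative marginal pdf f_ij(x_j); a Gaussian is assumed purely as an example.
m, sigma = 1.0, 0.5
f_ij = stats.norm(loc=m, scale=sigma).pdf

# Marginal median estimated from samples, as a marginal-median (MMSOM-style) update would use.
rng = np.random.default_rng(0)
samples = rng.normal(m, sigma, size=100_000)
w_M = np.median(samples)

# Check Equation 11.56: the mass below w_M equals the mass above w_M.
# The domain [phi_a, phi_b] is taken here as a wide finite interval.
phi_a, phi_b = m - 10 * sigma, m + 10 * sigma
left, _ = integrate.quad(f_ij, phi_a, w_M)
right, _ = integrate.quad(f_ij, w_M, phi_b)
print(f"w_M = {w_M:.4f}, left mass = {left:.4f}, right mass = {right:.4f}")  # both ~0.5
```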
Equations 11.24 and 11.54 give an implicit definition of the stationary state of SOM and MMSOM, respectively. To maintain simplicity, we confine the analysis to the 1D case (Equation 11.49).
Without loss of generality, we assume that m_1 < m_2. Obviously, to decide if an observation x belongs to the class C_i described by the pdf N(x; m_i, σ), i = 1, 2, we need a threshold T, such that if x ≤ T, x ∈ C_1; otherwise, x ∈ C_2. As has already been said, the nearest mean reclassification algorithm would yield the threshold given by Equation 11.50. It can be shown that
T_{\mathrm{opt}} = \frac{m_1 + m_2}{2} + \frac{\sigma^2}{m_1 - m_2}\,\ln\frac{1 - \varepsilon}{\varepsilon},     (11.57)

where the subscript “opt” implies that T_opt minimizes the probability of false classification. From Equation 11.57, it is seen that only for ε = 0.5,

T_{\mathrm{opt}} = \frac{m_1 + m_2}{2} = T_{\mathrm{mid}}.     (11.58)
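As a hedged illustration of Equations 11.57 and 11.58 (ε is taken here to be the mixing proportion of class C_1 in the two-class Gaussian mixture, a reading reconstructed from context; p_error is a hypothetical helper and the parameter values are illustrative), the closed-form threshold can be checked against a direct numerical minimization of the misclassification probability:

```python
import numpy as np
from scipy import stats, optimize

m1, m2, sigma, eps = 0.0, 2.0, 1.0, 0.3   # illustrative values; eps = mixing proportion of C_1

# Closed-form threshold of Equation 11.57.
T_opt = (m1 + m2) / 2 + sigma**2 / (m1 - m2) * np.log((1 - eps) / eps)

def p_error(T):
    """Probability of false classification for the rule: x <= T -> C_1, x > T -> C_2."""
    miss1 = eps * stats.norm.sf(T, loc=m1, scale=sigma)         # C_1 samples assigned to C_2
    miss2 = (1 - eps) * stats.norm.cdf(T, loc=m2, scale=sigma)  # C_2 samples assigned to C_1
    return miss1 + miss2

# The numerical minimizer agrees with T_opt; with eps = 0.5 the formula reduces to T_mid.
T_num = optimize.minimize_scalar(p_error, bounds=(m1 - 5 * sigma, m2 + 5 * sigma),
                                 method="bounded").x
print(f"T_opt = {T_opt:.4f}, numerical minimum at T = {T_num:.4f}")
print(f"T_mid = {(m1 + m2) / 2:.4f} (equal to T_opt only when eps = 0.5)")
```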
The SOM yields the following Voronoi neighborhoods:
V(W_1) = \{x \le T_{\mathrm{SOM}}\}, \qquad V(W_2) = \{x > T_{\mathrm{SOM}}\}.     (11.59)
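In one dimension, the nearest-weight rule behind the Voronoi regions of Equation 11.59 is equivalent to thresholding at the midpoint of the two weights, i.e., T_SOM = (w_1 + w_2)/2; a minimal check (the weight values are illustrative):

```python
import numpy as np

w1, w2 = 0.4, 1.9              # illustrative weight values, with w1 < w2
T_som = (w1 + w2) / 2          # midpoint = Voronoi boundary between the two 1D weights

rng = np.random.default_rng(1)
x = rng.uniform(w1 - 2.0, w2 + 2.0, size=10_000)
winner = np.argmin(np.abs(x[:, None] - np.array([w1, w2])), axis=1)  # nearest-weight index
by_threshold = (x > T_som).astype(int)                               # rule of Equation 11.59
assert np.array_equal(winner, by_threshold)
```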
Let us assume that T_SOM = T is known. Then, by using Equation 11.24, we obtain
w_1 = \frac{1}{F(T)} \left\{ \frac{1}{2}\left[\varepsilon m_1 + (1 - \varepsilon) m_2\right] + \varepsilon m_1 \operatorname{erf}\left(\frac{T - m_1}{\sigma}\right) + (1 - \varepsilon) m_2 \operatorname{erf}\left(\frac{T - m_2}{\sigma}\right) - \frac{\sigma}{\sqrt{2\pi}} \left[ \varepsilon \exp\left(-\frac{1}{2}\left(\frac{T - m_1}{\sigma}\right)^2\right) + (1 - \varepsilon) \exp\left(-\frac{1}{2}\left(\frac{T - m_2}{\sigma}\right)^2\right) \right] \right\},     (11.60)
where

F(T) = \frac{1}{2} + \varepsilon \operatorname{erf}\left(\frac{T - m_1}{\sigma}\right) + (1 - \varepsilon) \operatorname{erf}\left(\frac{T - m_2}{\sigma}\right)     (11.61)

and erf(a) is the error function defined as \operatorname{erf}(a) = \frac{1}{\sqrt{2\pi}} \int_0^a \exp(-t^2/2)\, dt.
For the second neuron, we have
w_2 = \frac{\left[\varepsilon m_1 + (1 - \varepsilon) m_2\right] - F(T)\, w_1}{1 - F(T)}.     (11.62)
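One way to solve the implicit stationary-state system numerically is to close Equations 11.60 to 11.62 with the Voronoi boundary T_SOM = (w_1 + w_2)/2 from Equation 11.59 and iterate to a fixed point. The sketch below is an assumption-laden illustration, not a procedure quoted from the text: it assumes NumPy/SciPy, uses the erf definition given above (so erf(a) = Φ(a) − 1/2 in terms of the standard normal cdf Φ), reads ε as the mixing proportion of C_1, and picks illustrative parameter values.

```python
import numpy as np
from scipy import stats

m1, m2, sigma, eps = 0.0, 2.0, 1.0, 0.3   # illustrative mixture parameters; eps = proportion of C_1

def erf_book(a):
    # erf(a) = (1/sqrt(2*pi)) * integral_0^a exp(-t^2/2) dt = Phi(a) - 1/2
    return stats.norm.cdf(a) - 0.5

def F(T):
    # Equation 11.61: probability mass of the mixture below T.
    return 0.5 + eps * erf_book((T - m1) / sigma) + (1 - eps) * erf_book((T - m2) / sigma)

def w1_of(T):
    # Equation 11.60: conditional mean of the mixture given x <= T.
    num = (0.5 * (eps * m1 + (1 - eps) * m2)
           + eps * m1 * erf_book((T - m1) / sigma)
           + (1 - eps) * m2 * erf_book((T - m2) / sigma)
           - sigma / np.sqrt(2 * np.pi)
             * (eps * np.exp(-0.5 * ((T - m1) / sigma) ** 2)
                + (1 - eps) * np.exp(-0.5 * ((T - m2) / sigma) ** 2)))
    return num / F(T)

def w2_of(T, w1):
    # Equation 11.62: conditional mean of the mixture given x > T.
    return ((eps * m1 + (1 - eps) * m2) - F(T) * w1) / (1 - F(T))

# Fixed-point iteration: start from the midpoint and alternate between updating the
# weights (Equations 11.60 and 11.62) and the boundary (Equation 11.59).
T = 0.5 * (m1 + m2)
for _ in range(200):
    w1 = w1_of(T)
    w2 = w2_of(T, w1)
    T = 0.5 * (w1 + w2)

print(f"stationary SOM state: w1 = {w1:.4f}, w2 = {w2:.4f}, T_SOM = {T:.4f}")
print(f"class means: m1 = {m1:.4f}, m2 = {m2:.4f}; T_mid = {0.5 * (m1 + m2):.4f}")
# For eps != 0.5, T_SOM generally differs from T_mid and from the optimal
# threshold of Equation 11.57.
```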