Digital Signal Processing Reference
FIGURE 10.7
(a) The proposed GMF model, where w_{j,k,i} and w_{j,x,y} are associated with two GMMs, Π_{j,k,i} and Π_{j,x,y}, respectively. (b) Context structure in LCHMM. The white node represents the hidden state variable S. The black node denotes the continuous random variable W. The square is the context node V of w_{j,k,i}. (c) Illustration of the fast implementation of the LCHMM training, where the gray part is the overlapped part of two neighboring local models, and the black and white parts are the distinct parts: the subtraction part and the addition part for the next local GMM. (The panels also annotate the training order and the neighborhoods N_{j,k,i}, N_{j,k,i+1}, N_{j,k,0}, and N_{j,k+1,0}.)
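The fast training in panel (c) hinges on reusing the overlapped part of two neighboring windows: when the window slides by one coefficient, only the departing column is subtracted and the entering column is added. A minimal sketch of that incremental update, assuming the local statistics are running first and second moments (the function name, border clipping, and moment choice are illustrative assumptions, not the book's exact algorithm):

```python
import numpy as np

def sliding_window_moments(w, C):
    """Incrementally compute local sums and sums of squares over a
    (2C+1)x(2C+1) window, in the spirit of Figure 10.7(c): the overlapped
    part is reused; only the departing column is subtracted and the
    entering column added. (Illustrative sketch, not the book's code.)"""
    H, W = w.shape
    moments = np.zeros((H, W, 2))  # [..., 0] = sum, [..., 1] = sum of squares
    for k in range(H):
        lo_r, hi_r = max(0, k - C), min(H, k + C + 1)
        # Initialize the window at i = 0 (clipped at the border).
        s = w[lo_r:hi_r, 0:min(W, C + 1)].sum()
        s2 = (w[lo_r:hi_r, 0:min(W, C + 1)] ** 2).sum()
        moments[k, 0] = (s, s2)
        for i in range(1, W):
            out_col = i - C - 1   # column leaving the window
            in_col = i + C        # column entering the window
            if out_col >= 0:      # subtraction part
                s -= w[lo_r:hi_r, out_col].sum()
                s2 -= (w[lo_r:hi_r, out_col] ** 2).sum()
            if in_col < W:        # addition part
                s += w[lo_r:hi_r, in_col].sum()
                s2 += (w[lo_r:hi_r, in_col] ** 2).sum()
            moments[k, i] = (s, s2)
    return moments
```

Per row, this costs O(2C+1) per step instead of O((2C+1)^2), which is the point of exploiting the overlap.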
GMF can be thought of as an extension of the GMM. GMF assumes that each wavelet coefficient, w_{j,k,i}, follows a local GMM parameterized by Π_{j,k,i} = {p_{S_{j,k,i}}(m), σ_{j,k,i,m} | m = 0, 1}. Π_{j,k,i} can be estimated from the neighborhood of w_{j,k,i}, N_{j,k,i}, which is selected by a square window of size 2C_j + 1 centered at w_{j,k,i}, as shown in Figure 10.7a, i.e., N_{j,k,i} = {w_{j,x,y} | x = k − C_j, ..., k + C_j; y = i − C_j, ..., i + C_j}. GMF is a highly localized model that exploits the local statistics of the wavelet coefficients. In particular, it applies to images where the nonstationary properties are prominent.
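The neighborhood definition and the estimation of a local two-state GMM from it can be sketched as follows. The zero-mean Gaussian states and the EM estimator are common wavelet-domain choices and are our assumptions here; the excerpt only specifies the parameter set {p(m), σ_m | m = 0, 1} and the window:

```python
import numpy as np

def neighborhood(wj, k, i, C):
    """N_{j,k,i}: coefficients in the (2C_j+1)x(2C_j+1) square window
    centered at w_{j,k,i}. Clipping at image borders is our assumption;
    border handling is not specified in the excerpt."""
    H, W = wj.shape
    return wj[max(0, k - C):k + C + 1, max(0, i - C):i + C + 1].ravel()

def local_gmm(nbhd, iters=50):
    """Estimate Pi = {p(m), sigma_m | m = 0, 1} for a two-state,
    zero-mean Gaussian mixture via EM (zero means are an assumption,
    typical for wavelet coefficients, not stated in the excerpt)."""
    x2 = nbhd.astype(float) ** 2
    p = np.array([0.5, 0.5])                            # state pmf p(m)
    var = np.array([0.5 * x2.mean(), 2.0 * x2.mean()])  # sigma_m^2, split init
    for _ in range(iters):
        # E-step: responsibilities of each zero-mean Gaussian state.
        lik = p / np.sqrt(2 * np.pi * var) * np.exp(-x2[:, None] / (2 * var))
        resp = lik / lik.sum(axis=1, keepdims=True)
        # M-step: update mixing weights and per-state variances.
        p = resp.mean(axis=0)
        var = (resp * x2[:, None]).sum(axis=0) / resp.sum(axis=0)
    return p, np.sqrt(var)
```

Running `local_gmm(neighborhood(wj, k, i, C))` at every (k, i) yields the field of local models Π_{j,k,i}; the overlap-based update sketched earlier is what makes sweeping these windows over the whole subband affordable.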