Digital Signal Processing Reference
cessfully applied to medical image processing [60]. In image compression, VQ provides an efficient technique for data compression: compression is achieved by transmitting the index of the code word instead of the vector itself.
VQ can be defined as a mapping that assigns each vector x = (x_0, x_1, ..., x_(n−1))^T in the n-dimensional space R^n to a code word from a finite subset of R^n. The subset Y = {y_i : i = 1, 2, ..., M}, representing the set of possible reconstruction vectors, is called a codebook of size M. Its members are called the code words. Note that both the input space and the codebook have the same dimension, and several y_i can be assigned to one class. In the encoding process, a distance measure, usually Euclidean, is evaluated to locate the closest code word for each input vector x. Then the address corresponding to that code word is assigned to x and transmitted. The distortion between the input vector and its corresponding code word y is defined by the distance d(x, y) = ||x − y||, where ||x|| represents the norm of x.
A vector quantizer achieving a minimum encoding error is referred to as a Voronoi quantizer. Figure 6.8 shows an input data space partitioned into four regions, called Voronoi cells, and the corresponding Voronoi vectors. Each region contains all the input vectors that lie closest to its Voronoi vector.
Recent developments in neural network architectures have led to a new unsupervised data-clustering technique, learning vector quantization (LVQ). Its architecture is similar to that of a competitive learning network, the only exception being that each output unit is associated with a class. The learning paradigm involves two steps. In the first step, the closest prototype (Voronoi vector) is located without using class information; in the second step, the Voronoi vector is adapted. If the classes of the input vector and the Voronoi vector match, the Voronoi vector is moved in the direction of the input vector x. Otherwise, the Voronoi vector w is moved away from x.
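The encoding step and the LVQ adaptation rule described above can be sketched as follows. This is a minimal illustration, not an implementation from the text: the codebook contents, class labels, and learning rate eta are arbitrary assumptions chosen for the example.

```python
# Sketch of VQ encoding and a basic LVQ (LVQ1-style) update.
# Codebook values, class labels, and the learning rate are illustrative
# assumptions, not taken from the text.
import numpy as np

def encode(x, codebook):
    """Return the index of the code word closest to x, using the
    Euclidean distortion d(x, y) = ||x - y||."""
    distances = np.linalg.norm(codebook - x, axis=1)
    return int(np.argmin(distances))

def lvq_update(x, x_class, codebook, codebook_classes, eta=0.1):
    """One LVQ step: move the winning Voronoi vector toward x when the
    classes match, away from x otherwise."""
    i = encode(x, codebook)
    if codebook_classes[i] == x_class:
        codebook[i] += eta * (x - codebook[i])   # classes match: attract
    else:
        codebook[i] -= eta * (x - codebook[i])   # mismatch: repel
    return i

# Usage: a codebook of M = 4 two-dimensional code words, two classes.
codebook = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
classes = np.array([0, 0, 1, 1])
x = np.array([0.9, 0.1])
idx = encode(x, codebook)          # only this index is transmitted
lvq_update(x, 0, codebook, classes)
```

In compression terms, only `idx` (here an integer in 0..3, i.e. two bits) needs to be transmitted in place of the full input vector.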
The LVQ algorithm is simple and is described below.
1. Initialization: Initialize the weight vectors {w_j(0) | j = 1, 2, ..., N} by setting them equal to the first N exemplar input feature vectors {x_i | i = 1, 2, ..., L}.
2. Sampling: Draw a sample x from the input data; the vector x represents