The main strength of the MLC algorithm is that it rests on well-developed
probability theory. If the class probability density functions are truly
Gaussian, MLC is the optimal classifier in the sense of minimizing the overall
probability of error (Liu et al. 2002). Benediktsson et al. (1990) observed that even for data that are
not normally distributed, MLC produced good classification results,
although it also has serious known weaknesses under specific conditions. Firstly, if the
histogram/frequency distribution of the image data does not follow a normal
distribution, the essential assumption of this classifier is violated and the results are poor or
misleading. Secondly, the computational cost of classifying each pixel
becomes an issue for data with a large number of spectral bands or with many spectral
classes to be distinguished: the computing cost increases in proportion
to the square of the number of feature channels (Benediktsson et al. 1990).
Thirdly, the algorithm works acceptably for relatively low spatial resolution data
with a limited number of bands, but it may not be acceptable for high-resolution
and/or high-dimensionality data sets, which tend to increase the within-class
variability. This means that the volume of feature space occupied by each class
expands, which increases the risk of class overlap in feature space (Qiu and Jensen
2004). Fourthly, the relationship between the sample size and the number of features
affects the estimates of the mean vector and the variance-covariance matrix. In addition,
inadequate ground-truth data may yield biased estimates of the population mean vector
and variance-covariance matrix, leading to poor classification results.
Fifthly, when two spectral bands are highly correlated (as is common in Landsat data),
or when the training samples used for signature generation are not sufficiently
homogeneous, the covariance matrix becomes unstable. This can be mitigated
by applying another statistical technique (e.g., PCA) before proceeding to
classification (Albertz 2009). Sixthly, an inherent weakness of MLC is that the
subset of features used in classification is not necessarily the optimal
selection for all classes (Swain and Hauska 1977). Finally, when auxiliary data are
integrated into the classification process, the distributional assumptions of MLC
cannot be confirmed.
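
To make these statistical assumptions concrete, the following is a minimal sketch of the Gaussian maximum-likelihood decision rule using NumPy, assuming equal prior probabilities; the function names (train_mlc, classify_mlc) and the data layout are illustrative only and are not taken from any of the cited works. It shows where the mean vector and variance-covariance matrix enter, why a near-singular covariance matrix (the fifth limitation) is a problem, and where the per-pixel cost that grows with the square of the band count (the second limitation) arises.

import numpy as np

def train_mlc(samples_by_class):
    # samples_by_class: dict mapping class label -> array (n_samples, n_bands)
    # of training pixels. Returns per-class mean vector, inverse covariance
    # matrix and log-determinant of the covariance matrix.
    stats = {}
    for label, X in samples_by_class.items():
        mu = X.mean(axis=0)
        # The covariance matrix becomes near-singular when bands are highly
        # correlated or the training samples are not homogeneous (fifth limitation).
        cov = np.cov(X, rowvar=False)
        stats[label] = (mu, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])
    return stats

def classify_mlc(pixels, stats):
    # pixels: array (n_pixels, n_bands); assigns each pixel to the class with
    # the highest Gaussian log-likelihood, assuming equal prior probabilities.
    labels = list(stats.keys())
    scores = np.empty((pixels.shape[0], len(labels)))
    for j, label in enumerate(labels):
        mu, inv_cov, logdet = stats[label]
        d = pixels - mu
        # Mahalanobis distance term: its cost grows roughly with the square of
        # the number of bands, hence the quadratic computing cost noted above.
        maha = np.einsum('ij,jk,ik->i', d, inv_cov, d)
        scores[:, j] = -0.5 * (logdet + maha)
    return np.array(labels)[np.argmax(scores, axis=1)]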
A number of researchers have applied MLC (e.g., Brisco and
Brown 1995; Huang et al. 2007). MLC can be used with multi-source data
measured on different scales (Arora and Mathur 2001), while a parametric
MLC, which is commonly used for pixel-based hard classifications, can also be used to
segment imagery (e.g., Geneletti and Gorte 2003) or be extended to a fuzzy classifi-
cation approach (e.g., Schowengerdt 1996), as sketched below.
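
For illustration, the hard decision rule above can be relaxed into soft class memberships by normalizing the per-class likelihoods for each pixel. This is only a minimal sketch of the fuzzy-classification idea, not Schowengerdt's exact formulation, and it reuses the hypothetical stats structure from the earlier sketch.

import numpy as np

def fuzzy_mlc_memberships(pixels, stats):
    # stats holds the same per-class (mean, inverse covariance, log-determinant)
    # entries as in the earlier sketch; returns membership grades in [0, 1]
    # that sum to one for each pixel, instead of a single hard label.
    labels = list(stats.keys())
    log_like = np.empty((pixels.shape[0], len(labels)))
    for j, label in enumerate(labels):
        mu, inv_cov, logdet = stats[label]
        d = pixels - mu
        log_like[:, j] = -0.5 * (logdet + np.einsum('ij,jk,ik->i', d, inv_cov, d))
    log_like -= log_like.max(axis=1, keepdims=True)  # for numerical stability
    like = np.exp(log_like)
    return labels, like / like.sum(axis=1, keepdims=True)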
ANN
Humans are good pattern recognizers. This observation has given researchers in the field
of pattern recognition the basic idea of examining whether computer systems
based on a simplified model of the human mind can model the real world, and
whether they can achieve better overall accuracies than traditional statistical
approaches. The Artificial Neural Network (ANN) algorithm is an example of such an approach.