- The self-organizing map (SOM) was developed by Professor Teuvo Kohonen and has proven useful in many applications. It is one of the most popular neural network models and belongs to the category of competitive learning networks. Because the SOM can detect features inherent to the problem, it has also been called the self-organizing feature map (SOFM), and it can serve as a cluster-analysis tool for high-dimensional data.
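
A minimal sketch of one way a SOM can be trained with plain NumPy is shown below; the grid size, learning rate, and neighborhood width are illustrative assumptions, not values from the text.

    import numpy as np

    def train_som(data, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
        """Train a simple self-organizing map on the rows of `data` (sketch)."""
        rng = np.random.default_rng(seed)
        n_rows, n_cols = grid
        dim = data.shape[1]
        # One weight vector per map unit, initialised randomly.
        weights = rng.random((n_rows, n_cols, dim))
        # Grid coordinates of every unit, used by the neighborhood function.
        coords = np.stack(np.meshgrid(np.arange(n_rows), np.arange(n_cols),
                                      indexing="ij"), axis=-1)
        n_steps = epochs * len(data)
        step = 0
        for _ in range(epochs):
            for x in rng.permutation(data):
                # Competitive step: find the best-matching unit (BMU).
                dist = np.linalg.norm(weights - x, axis=-1)
                bmu = np.unravel_index(np.argmin(dist), dist.shape)
                # Gaussian neighborhood around the BMU; learning rate and
                # radius decay as training progresses.
                frac = step / n_steps
                lr = lr0 * (1.0 - frac)
                sigma = sigma0 * (1.0 - frac) + 1e-3
                grid_dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
                h = np.exp(-grid_dist2 / (2.0 * sigma ** 2))
                # Adaptive step: move weights toward the current input.
                weights += lr * h[..., None] * (x - weights)
                step += 1
        return weights

    # Toy usage: map 50-dimensional samples onto a 10x10 grid.
    X = np.random.default_rng(1).random((200, 50))
    som_weights = train_som(X)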
- Independent component analysis (ICA) is an extension of PCA that is useful for estimating the unknown original sources underlying a mixed signal. Principal component analysis itself is commonly used for sorting neuronal spikes (action potentials).
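
As an illustrative sketch (not taken from the text), scikit-learn's FastICA can recover estimates of independent sources from linearly mixed observations; the source signals and mixing matrix below are invented for the example.

    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 8, 2000)
    # Two hypothetical source signals and a random mixing matrix.
    sources = np.c_[np.sin(2 * t), np.sign(np.cos(3 * t))]
    mixing = rng.random((2, 2))
    observed = sources @ mixing.T          # what the sensors would record

    ica = FastICA(n_components=2, random_state=0)
    estimated_sources = ica.fit_transform(observed)   # estimates of the originals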
- Cluster analysis is a major technique for classifying a 'mountain' of information into manageable, meaningful piles. It is a data reduction tool that creates subgroups that are more manageable than individual data points. Like factor analysis, it examines the full complement of inter-relationships between variables.
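
A small k-means example (one common clustering algorithm, used here purely for illustration) shows how observations are reduced to a handful of subgroup labels; the synthetic data and the choice of three clusters are assumptions.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    # Hypothetical data: three loose groups in five dimensions.
    X = np.vstack([rng.normal(loc=c, scale=0.5, size=(50, 5)) for c in (0.0, 3.0, 6.0)])

    kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
    labels = kmeans.fit_predict(X)   # one subgroup label per observation
    print(np.bincount(labels))       # size of each "pile"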
- Supervised learning: a desired output for each input vector is required when the network is trained.
- Linear discriminant analysis (LDA) is a signal classification technique that directly maximizes class separability, generating projections in which the examples of each class form compact clusters and the different clusters are far from each other. Quadratic discriminant analysis (QDA) is closely related to LDA.
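
A brief scikit-learn sketch contrasting LDA and QDA on synthetic labelled data; the two-class dataset below is invented for illustration.

    import numpy as np
    from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                                QuadraticDiscriminantAnalysis)

    rng = np.random.default_rng(0)
    # Two hypothetical classes with different means.
    X = np.vstack([rng.normal(0, 1, (100, 4)), rng.normal(2, 1, (100, 4))])
    y = np.array([0] * 100 + [1] * 100)

    lda = LinearDiscriminantAnalysis().fit(X, y)
    qda = QuadraticDiscriminantAnalysis().fit(X, y)
    projected = lda.transform(X)          # 1-D projection maximizing class separability
    print(lda.score(X, y), qda.score(X, y))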
- Partial least squares (PLS) is the "gold standard" in chemometrics owing to its ability to handle collinear data and reduce the number of required calibration samples.
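
A minimal PLS regression sketch with scikit-learn, using deliberately collinear predictors invented for the example; the two-component choice is an assumption.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    # Hypothetical collinear predictors: columns are near-copies of one latent factor.
    latent = rng.normal(size=(60, 1))
    X = latent + 0.05 * rng.normal(size=(60, 10))
    y = 3.0 * latent[:, 0] + 0.1 * rng.normal(size=60)

    pls = PLSRegression(n_components=2)
    pls.fit(X, y)
    y_hat = pls.predict(X)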
- Feature subset selection (FSS) is a dimensionality reduction technique that can be used to configure small sensor arrays for specific odor-measurement applications. The goal of FSS is to find an "optimal" subset of sensors (or features) that maximizes information content or predictive accuracy. The simplest FSS approach consists of evaluating each feature individually and selecting the features with the highest scores; unfortunately, this approach ignores feature redundancy and will rarely find an optimal subset.
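
A sketch of the simplest FSS strategy described above (scoring each feature individually and keeping the top-scoring ones), here via scikit-learn's univariate selection on synthetic data; as the text notes, this ignores redundancy between features.

    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif

    rng = np.random.default_rng(0)
    # Hypothetical sensor array: 20 features, only a few of them informative.
    X = rng.normal(size=(120, 20))
    y = (X[:, 3] + X[:, 7] > 0).astype(int)   # class depends on features 3 and 7

    selector = SelectKBest(score_func=f_classif, k=4)
    selector.fit(X, y)
    print(np.argsort(selector.scores_)[::-1][:4])   # indices of the top-scoring features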
• Classifiers:
- Regressions
Principal component regression (PCR) is an alternative solution to the OLS (ordinary least squares) collinearity problem: perform PCA and retain only a few of the principal components as "latent variables" for the regression.
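
A PCR sketch: PCA reduces the collinear predictors to a few latent components, which then feed an ordinary least-squares fit; the pipeline and the two-component choice are illustrative assumptions.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    # Hypothetical collinear design matrix (8 highly correlated columns).
    base = rng.normal(size=(80, 2))
    X = np.hstack([base, base @ rng.normal(size=(2, 6))])
    y = X[:, 0] - 2 * X[:, 1] + 0.1 * rng.normal(size=80)

    pcr = make_pipeline(PCA(n_components=2), LinearRegression())
    pcr.fit(X, y)
    y_hat = pcr.predict(X)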
An MLR (multiple linear regression) analysis is carried out to predict the values of a dependent variable, Y, given a set of p explanatory variables (x1, x2, …, xp).
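
A compact multiple linear regression example for predicting Y from p explanatory variables, here with p = 3 and synthetic data; NumPy's least-squares solver stands in for any MLR routine.

    import numpy as np

    rng = np.random.default_rng(0)
    p = 3
    X = rng.normal(size=(100, p))                          # explanatory variables x1..xp
    beta_true = np.array([1.5, -2.0, 0.5])
    y = X @ beta_true + 0.8 + 0.1 * rng.normal(size=100)   # dependent variable Y

    # Add an intercept column and fit by least squares.
    X1 = np.column_stack([np.ones(len(X)), X])
    coeffs, *_ = np.linalg.lstsq(X1, y, rcond=None)
    print(coeffs)   # intercept followed by the p slope estimates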
Canonical correlation regression: The CCR estimator is based on a transformation
of the variables in the co-integrating regression that removes the second-order bias
of the OLS estimator.
- Neural nets