The BuildSettings.setAlgorithmSettings method is used to specify
algorithm settings. JDM defines the algorithm settings interfaces
under the javax.datamining.algorithm package. Table 9-11 shows the
list of classification algorithm interfaces supported by JDM.
In Listing 9-9, we illustrate the creation of a simple decision tree
algorithm settings object, which specifies the maximum allowed depth
of the tree as 10, the minimum node size as 5, and the tree homogeneity
metric as gini, as shown in lines 6 to 12. Refer to the JDM API
documentation [JDM11 2006] for a complete listing of the available
algorithm settings methods.
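Since Listing 9-9 appears elsewhere in the chapter, the sketch below mirrors the calls it describes using minimal hypothetical stand-ins for the javax.datamining.algorithm.tree types, so it compiles without a JDM implementation; in real JDM code these interfaces are obtained through the connection's object factories rather than constructed directly.

```java
// Hypothetical stand-ins mirroring the JDM tree-settings names described in
// the text; the real interfaces live in javax.datamining.algorithm.tree and
// are created via JDM object factories, not constructors.
enum TreeHomogeneityMetric { gini, entropy }

class TreeSettings {
    private int maxDepth;
    private int minNodeSize;
    private TreeHomogeneityMetric metric;

    public void setMaxDepth(int depth)   { this.maxDepth = depth; }
    public void setMinNodeSize(int size) { this.minNodeSize = size; }
    public void setBuildHomogeneityMetric(TreeHomogeneityMetric m) {
        this.metric = m;
    }

    @Override public String toString() {
        return "TreeSettings[maxDepth=" + maxDepth
             + ", minNodeSize=" + minNodeSize + ", metric=" + metric + "]";
    }
}

public class TreeSettingsSketch {
    public static void main(String[] args) {
        // Configure the tree as described for Listing 9-9: depth limit 10,
        // minimum node size 5, gini as the homogeneity metric.
        TreeSettings settings = new TreeSettings();
        settings.setMaxDepth(10);
        settings.setMinNodeSize(5);
        settings.setBuildHomogeneityMetric(TreeHomogeneityMetric.gini);
        System.out.println(settings);
    }
}
```

The resulting settings object would then be attached to the build settings via BuildSettings.setAlgorithmSettings, as described above.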
Table 9-11: javax.datamining.algorithm interfaces for the classification function

Decision Tree: javax.datamining.algorithm.tree package
- A TreeSettings object encapsulates decision tree algorithm-specific settings, such as maximum tree depth, minimum node size, homogeneity metric, etc.
- TreeHomogeneityMetric enumerates the various types of goodness measures of a split.
- TreeSelectionMethod enumerates the types of methodologies used for choosing the best tree along the pruning path.

Naïve Bayes: javax.datamining.algorithm.naivebayes package
- A NaiveBayesSettings object encapsulates naïve Bayes algorithm-specific settings, such as singleton and pairwise thresholds.

Support Vector Machine (SVM): javax.datamining.algorithm.svm.classification package
- An SVMClassificationSettings object encapsulates Support Vector Machine (SVM) algorithm-specific settings used for the classification function, such as the type of kernel function, complexity factor, etc.

Feed Forward Neural Net: javax.datamining.algorithm.feedforwardneuralnet package
- A FeedForwardNeuralNetSettings object captures the parameters associated with a neural network algorithm, such as the type of learning algorithm, hidden neural layers, maximum number of iterations, etc.
- A NeuralLayer object captures the parameters required to describe a layer in a neural network model, such as activation function, number of nodes, bias, etc.
- A Backpropagation object specifies the parameters used by the backpropagation learning algorithm, such as learning rate, momentum, etc.
- The enumeration ActivationFunction indicates the type of activation function used by the neural layer.
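To make the relationships among the neural network settings objects concrete, the sketch below wires together minimal hypothetical stand-ins for FeedForwardNeuralNetSettings, NeuralLayer, Backpropagation, and ActivationFunction; the method names and parameter choices here are illustrative assumptions, since the real interfaces live in javax.datamining.algorithm.feedforwardneuralnet and are obtained through JDM factories.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-ins for the feed forward neural net settings objects
// described in Table 9-11; real JDM code obtains these from factories in
// javax.datamining.algorithm.feedforwardneuralnet.
enum ActivationFunction { sigmoid, tanh, linear }

class NeuralLayer {
    final int numNodes;                  // number of nodes in this layer
    final ActivationFunction activation; // activation function for the layer
    final double bias;                   // bias term applied to the layer
    NeuralLayer(int numNodes, ActivationFunction activation, double bias) {
        this.numNodes = numNodes;
        this.activation = activation;
        this.bias = bias;
    }
}

class Backpropagation {
    final double learningRate;
    final double momentum;
    Backpropagation(double learningRate, double momentum) {
        this.learningRate = learningRate;
        this.momentum = momentum;
    }
}

class FeedForwardNeuralNetSettings {
    private final List<NeuralLayer> hiddenLayers = new ArrayList<>();
    private Backpropagation learningAlgorithm;
    private int maxNumberOfIterations;

    void addHiddenLayer(NeuralLayer layer)       { hiddenLayers.add(layer); }
    void setLearningAlgorithm(Backpropagation b) { learningAlgorithm = b; }
    void setMaxNumberOfIterations(int n)         { maxNumberOfIterations = n; }
    int hiddenLayerCount()                       { return hiddenLayers.size(); }
}

public class NeuralNetSettingsSketch {
    public static void main(String[] args) {
        // Two hidden layers, backpropagation learning, capped at 500 iterations.
        FeedForwardNeuralNetSettings settings = new FeedForwardNeuralNetSettings();
        settings.addHiddenLayer(new NeuralLayer(8, ActivationFunction.sigmoid, 1.0));
        settings.addHiddenLayer(new NeuralLayer(4, ActivationFunction.tanh, 1.0));
        settings.setLearningAlgorithm(new Backpropagation(0.1, 0.9));
        settings.setMaxNumberOfIterations(500);
        System.out.println("hidden layers: " + settings.hiddenLayerCount());
    }
}
```

The design point to notice is the composition: the top-level settings object aggregates per-layer objects and a learning-algorithm object, rather than exposing every parameter as a flat list of setters.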