Decision tree induction algorithms are greedy by nature and construct the decision tree in a top-down, recursive manner (also known as divide and conquer). In each iteration, the algorithm considers partitioning the training set according to the outcomes of a discrete input attribute. The most appropriate attribute is selected according to some splitting measure. After an appropriate split is selected, each node further subdivides the training set into smaller subsets, until a stopping criterion is satisfied.
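To make the top-down procedure concrete, the following Python sketch shows one possible rendering of greedy divide-and-conquer induction. It is illustrative only: the function name grow_tree, the splitting_measure callback, and the (features, label) data layout are assumptions rather than anything prescribed by the text, and a full implementation would apply the stopping rules discussed in Section 3.6 below.

```python
from collections import Counter

def grow_tree(examples, attributes, splitting_measure):
    """Greedy top-down (divide-and-conquer) tree induction sketch.

    examples: list of (features: dict, label) pairs.
    splitting_measure(examples, attribute) -> float scores a candidate
    split (e.g. information gain); the name is a placeholder.
    """
    labels = [y for _, y in examples]
    # Simple stopping criterion: the node is pure, or no attributes remain.
    if len(set(labels)) == 1 or not attributes:
        return Counter(labels).most_common(1)[0][0]  # leaf: majority label

    # Greedy step: choose the attribute with the best splitting measure.
    best = max(attributes, key=lambda a: splitting_measure(examples, a))

    # Partition the training set by each outcome of the chosen attribute
    # and recurse on the resulting subsets.
    node = {"attribute": best, "children": {}}
    remaining = [a for a in attributes if a != best]
    for outcome in set(x[best] for x, _ in examples):
        subset = [(x, y) for x, y in examples if x[best] == outcome]
        node["children"][outcome] = grow_tree(subset, remaining, splitting_measure)
    return node
```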
3.6 Stopping Criteria
The growing phase continues until a stopping criterion is triggered. The
following conditions are common stopping rules:
(1) All instances in the training set belong to a single value of y.
(2) The maximum tree depth has been reached.
(3) The number of cases in the terminal node is less than the minimum
number of cases for parent nodes.
(4) If the node were split, the number of cases in one or more child nodes
would be less than the minimum number of cases for child nodes.
(5) The value of the best splitting measure is not greater than a certain threshold.
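A minimal sketch of these rules as a single check is given below. All parameter names are illustrative, and thresholds such as max_depth, min_parent_size, min_child_size, and score_threshold are tuning choices made by the user, not values prescribed by the text.

```python
def should_stop(node_labels, depth, max_depth, child_sizes,
                min_parent_size, min_child_size,
                best_split_score, score_threshold):
    """Return True if any of the common stopping rules listed above fires."""
    # (1) All instances share a single value of y.
    if len(set(node_labels)) == 1:
        return True
    # (2) The maximum tree depth has been reached.
    if depth >= max_depth:
        return True
    # (3) The node holds fewer cases than the minimum for a parent node.
    if len(node_labels) < min_parent_size:
        return True
    # (4) Splitting would create at least one undersized child node.
    if any(size < min_child_size for size in child_sizes):
        return True
    # (5) The best splitting measure does not exceed the threshold.
    if best_split_score <= score_threshold:
        return True
    return False
```

In practice these rules are checked before each recursive call of the growing phase, so that a node failing any of them becomes a leaf.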