Figure 13-7. Splines to create the three desired outputs from our cross-validated Decision Tree data mining model.
9) Run the model. The ExampleSet (tra port) and Tree (mod port) tabs will be familiar to you. The PerformanceVector (avg port) tab is new, and in the context of Evaluation and Deployment, it is the most interesting to us. We see that using this training data set and Decision Tree algorithm (gain_ratio), RapidMiner calculates a 54% accuracy rate for this model. This overall accuracy rate is built from the class precision rates for each possible value in our eReader_Adoption attribute. Take pred. Late Majority as an example: its class precision, the percentage of observations predicted to be Late Majority that actually are Late Majority, is 69.8%, which means the remaining 30.2% of those predictions are false positives. If all of the possible eReader_Adoption values had class precisions of 69.8%, then our model's overall accuracy would be 69.8% as well, but they don't; some are lower, so when the class precisions are weighted by how many predictions fall into each value and averaged, our model's overall accuracy comes out to only 54%.
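Because the confusion matrix behind these figures is not reproduced in this excerpt, the short Python sketch below uses a made-up matrix, chosen only so that one class lands near 69.8% precision and the overall accuracy near 54%, to illustrate how an overall accuracy figure like RapidMiner's is simply the per-class precisions weighted by how many predictions fall into each class. The counts and the non-Late-Majority labels are illustrative assumptions, not the chapter's actual results.

import numpy as np

# Hypothetical counts: rows are predicted classes, columns are true classes,
# mimicking the layout of a PerformanceVector confusion matrix.
labels = ["Innovator", "Early Adopter", "Early Majority", "Late Majority"]
confusion = np.array([
    [25, 15, 12,   8],
    [12, 35, 22,  13],
    [ 8, 22, 58,  35],
    [ 5, 10, 33, 111],
])

# Overall accuracy: correct predictions (the diagonal) over all predictions.
accuracy = np.trace(confusion) / confusion.sum()

# Class precision: of the observations predicted as a class, the share that
# truly belong to it; the complement is that class's share of false positives.
for i, label in enumerate(labels):
    precision = confusion[i, i] / confusion[i].sum()
    print(f"pred. {label:14s} precision = {precision:5.1%}  "
          f"(false positives = {1 - precision:5.1%})")

print(f"overall accuracy = {accuracy:.1%}")

# Accuracy equals the per-class precisions weighted by how many predictions
# fell into each class, which is why one class near 70% precision can still
# coexist with an overall accuracy of only about 54%.
weights = confusion.sum(axis=1) / confusion.sum()
precisions = np.diag(confusion) / confusion.sum(axis=1)
assert np.isclose(accuracy, (weights * precisions).sum())

Running this sketch prints a precision of 69.8% for the pred. Late Majority row and an overall accuracy of 54.0%, reproducing the arithmetic described above on the assumed counts.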