Machine Learning - Decision Stump


About

A decision stump makes a binary split on one of the attributes. It is considered a weak learner because it can only produce a tree with one level, as in One Rule (OneR).
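As a minimal sketch (not code from this article), a stump can be fit by exhaustively trying every attribute/threshold pair and keeping the single binary split with the lowest training error. The function names and the assumption of non-negative integer class labels are illustrative only.

```python
import numpy as np

def fit_stump(X, y):
    """Fit a decision stump: one binary split on one attribute.
    Assumes y holds non-negative integer class labels (e.g. 0/1)."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):            # try each attribute in turn
        for t in np.unique(X[:, j]):       # candidate thresholds for that attribute
            left, right = y[X[:, j] <= t], y[X[:, j] > t]
            if left.size == 0 or right.size == 0:
                continue
            l_lab = np.bincount(left).argmax()    # majority label, left branch
            r_lab = np.bincount(right).argmax()   # majority label, right branch
            err = np.sum(left != l_lab) + np.sum(right != r_lab)
            if err < best_err:
                best_err, best = err, (j, t, l_lab, r_lab)
    return best

def predict_stump(stump, X):
    j, t, l_lab, r_lab = stump
    return np.where(X[:, j] <= t, l_lab, r_lab)
```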

The boosting algorithm AdaBoostM1 utilizes it by default as its base learner.
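For context, a hedged equivalent in scikit-learn (rather than Weka's AdaBoostM1): a depth-1 DecisionTreeClassifier plays the role of the stump, and it is also AdaBoostClassifier's default base estimator. The dataset and parameter values below are illustrative only.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

stump = DecisionTreeClassifier(max_depth=1)          # one level = a decision stump
boosted = AdaBoostClassifier(estimator=stump,        # named 'base_estimator' in older releases
                             n_estimators=50)

print(cross_val_score(stump, X, y, cv=5).mean())     # the weak learner alone
print(cross_val_score(boosted, X, y, cv=5).mean())   # boosted stumps
```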

Discover More

Data Mining - (Boosting|Gradient Boosting|Boosting trees)

Boosting forces new classifiers to focus on the errors produced by earlier ones; it works by aggressively reducing the training error. Gradient Boosting is an algorithm based on an ensemble of decision trees.

Data Mining - Decision Tree (DT) Algorithm

Decision Trees (DT) are supervised classification algorithms. They are easy to interpret (due to the tree structure) and represent a boolean function when each decision is binary (i.e. true or false).


