Data Mining - Stacking


About

Stacking is an ensemble of models combined sequentially.

Instead of voting, stacking combines the predictions of its “base learners” (the experts) with a “meta-learner”: another learning scheme that takes the outputs of the base learners as its input.

For example, build a linear model as a first step, then as a second step fit a random forest using the residual errors as the response.
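The two-step example above can be sketched as follows. This is a minimal illustration, assuming scikit-learn and a synthetic regression dataset; the model choices and parameters are illustrative, not prescribed by the text.

```python
# Sketch: fit a linear model, then fit a random forest on its
# residual errors as the response (assumes scikit-learn).
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression

# Synthetic data stands in for a real dataset
X, y = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=0)

# Step 1: level-0 linear model
linear = LinearRegression().fit(X, y)
residuals = y - linear.predict(X)

# Step 2: a random forest learns the residual errors as the response
forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, residuals)

# Combined prediction: linear trend plus the forest's residual correction
y_hat = linear.predict(X) + forest.predict(X)
```

On the training data, the combined prediction corrects part of the linear model's error, which is exactly what fitting the residuals is meant to achieve.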

  • Base learners: level-0 models
  • Meta-learner: level-1 model

The predictions of the base learners are input for the meta-learner.

Typically, different machine learning schemes are used as base learners to get different perspectives on the data.

Predictions made on the training data cannot be used to generate the data for the level-1 model; instead, a cross-validation-like scheme is used to produce them.
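The full scheme can be sketched with scikit-learn (an assumed library choice; the document itself only names Weka). Its `StackingClassifier` trains the level-1 model on cross-validated, out-of-fold predictions of the level-0 models, which is the cross-validation-like scheme described above.

```python
# Sketch of stacking: level-0 base learners feed a level-1 meta-learner.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Level-0 models: different schemes give different perspectives
base_learners = [
    ("tree", DecisionTreeClassifier(random_state=0)),
    ("forest", RandomForestClassifier(n_estimators=50, random_state=0)),
]

# Level-1 (meta) model combines the base learners' predictions;
# cv=5 generates its training data from 5-fold out-of-fold predictions
stack = StackingClassifier(
    estimators=base_learners,
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5,
)
stack.fit(X_train, y_train)
print(stack.score(X_test, y_test))
```

Note that the meta-learner never sees predictions the base learners made on their own training folds, which avoids the overfitting the text warns about.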

Implementation

Weka

Weka provides the Stacking classifier and StackingC, a more efficient variant; both allow multiple level-0 models to be combined by specifying a meta-classifier.
