Ensemble Learning
Ensemble Learning uses multiple Machine Learning Models to obtain better predictive performance than any single Model could achieve on its own.
Ensemble Learning Process
The ensemble learning process iteratively uses individual models to generate predictive results that contribute to the overall Ensemble result.
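For instance, with a classification task, the individual predictions can be combined by majority vote. The sketch below is only an illustration under that assumption: the three prediction arrays stand in for the outputs of hypothetical, already-trained models.

```python
# Minimal sketch of combining individual model predictions into an overall
# ensemble result by majority vote (classification assumed; the prediction
# arrays are hypothetical outputs of already-trained models).
import numpy as np

pred_model_a = np.array([0, 1, 1, 0, 1])  # hypothetical predictions, model A
pred_model_b = np.array([0, 1, 0, 0, 1])  # hypothetical predictions, model B
pred_model_c = np.array([1, 1, 1, 0, 0])  # hypothetical predictions, model C

all_preds = np.vstack([pred_model_a, pred_model_b, pred_model_c])

# With three binary voters, a class wins when at least two models agree.
ensemble_pred = (all_preds.sum(axis=0) >= 2).astype(int)
print(ensemble_pred)  # [0 1 1 0 1]
```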
Strategy
Strategies for combining individual modeling processes include the following (several are sketched in code after this list):
Bagging (Bootstrap Aggregating) - each model is trained on a bootstrap sample (a random sample drawn with replacement) of the original training data, and the models' predictions are aggregated by voting or averaging
Boosting - incrementally building an ensemble by training each new model instance to emphasize the training instances that previous models misclassified
Bayesian Model Averaging - seeks to approximate the Bayes optimal classifier by sampling hypotheses from the hypothesis space and combining them using Bayes' theorem
Bucketing - a selection algorithm is used to choose the best model
Stacking - training a model to combine the predictions of several other models
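As a minimal sketch of how three of these strategies look in practice, the example below uses scikit-learn's BaggingClassifier, AdaBoostClassifier, and StackingClassifier on a synthetic dataset; the base models and parameters are illustrative choices, not a prescription.

```python
# Sketch of the bagging, boosting, and stacking strategies with scikit-learn
# (assumes scikit-learn is installed; dataset and base models are illustrative).
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Bagging: each tree is trained on a bootstrap sample of the training data.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                            random_state=0)

# Boosting: each new weak learner emphasizes examples the previous ones
# misclassified.
boosting = AdaBoostClassifier(n_estimators=50, random_state=0)

# Stacking: a logistic regression learns to combine the base models' outputs.
stacking = StackingClassifier(
    estimators=[("bag", bagging), ("boost", boosting)],
    final_estimator=LogisticRegression())

for name, model in [("bagging", bagging), ("boosting", boosting),
                    ("stacking", stacking)]:
    model.fit(X_train, y_train)
    print(name, model.score(X_test, y_test))
```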
Individual Modeling Process
The individual Modeling Processes use data to train models that produce results for analysis.
Data
Includes data for the following (a splitting sketch follows this list):
model training
model testing
model performance tracking
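A common way to obtain these three kinds of data is to partition a single dataset, as in the sketch below; scikit-learn is assumed, and the synthetic dataset and 60/20/20 split ratios are illustrative.

```python
# Sketch of partitioning a dataset for training, testing, and performance
# tracking (assumes scikit-learn; the split ratios are illustrative).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=0)

# Hold out 20% of the data for model testing.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# From the remaining 80%, hold out a quarter (20% of the whole) as a
# validation set for tracking model performance during development.
X_train, X_val, y_train, y_val = train_test_split(
    X_train, y_train, test_size=0.25, random_state=0)

print(len(X_train), len(X_val), len(X_test))  # 600 200 200
```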
Models
Any Machine Learning model can be used for Ensemble Learning. Model types typically used include:
decision trees (for example in Random Forests and gradient-boosted trees)
neural networks
support vector machines
naive Bayes classifiers
Results
Results include the following (a collection sketch follows this list):
model training accuracies
prediction confidence levels and accuracies
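The sketch below collects these results for a single ensemble model: training and test accuracies via score, and a per-prediction confidence level taken as the highest predicted class probability. Scikit-learn and a random forest are assumed as illustrative choices.

```python
# Sketch of collecting training accuracy, test accuracy, and per-prediction
# confidence levels (assumes scikit-learn; the model and data are illustrative).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

print("training accuracy:", model.score(X_train, y_train))
print("test accuracy:", model.score(X_test, y_test))

# Confidence level of each prediction: the highest predicted class probability.
confidences = model.predict_proba(X_test).max(axis=1)
print("mean prediction confidence:", confidences.mean())
```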
Analysis
Includes analysis of the following (a comparison sketch follows this list):
data
individual model performance
ensemble results
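For comparing individual model performance against the ensemble result, cross-validated scores offer a simple starting point, as in the sketch below; scikit-learn is assumed, and the base models and metric are illustrative.

```python
# Sketch of analysing individual model performance against the ensemble
# result (assumes scikit-learn; models, dataset, and metric are illustrative).
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)

base_models = [("tree", DecisionTreeClassifier(random_state=0)),
               ("logreg", LogisticRegression(max_iter=1000)),
               ("nb", GaussianNB())]
ensemble = VotingClassifier(estimators=base_models, voting="soft")

# Cross-validated accuracy of each individual model and of the ensemble.
for name, model in base_models + [("ensemble", ensemble)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f}")
```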