Bagging
Bootstrap Aggregating. Train multiple models on different bootstrap samples (random subsets drawn with replacement) of the training data, then aggregate their predictions: average them for regression, take a majority vote for classification. Averaging many high-variance models trained on different resamples reduces variance without substantially increasing bias. Random Forest is the best-known bagging method; it combines bagging with random feature selection at each split.
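
A minimal from-scratch sketch of the idea for regression, assuming NumPy and scikit-learn's DecisionTreeRegressor are available; the toy data, n_estimators, and variable names are illustrative, not part of the original description:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Toy data: noisy sine curve (illustrative only).
X = rng.uniform(0, 6, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=200)

n_estimators = 50
models = []
for _ in range(n_estimators):
    # Bootstrap sample: draw n indices with replacement.
    idx = rng.integers(0, len(X), size=len(X))
    # Deep trees are low-bias / high-variance, which is what bagging helps with.
    tree = DecisionTreeRegressor()
    tree.fit(X[idx], y[idx])
    models.append(tree)

# Aggregate: average the individual predictions
# (for classification, take a majority vote instead).
X_test = np.linspace(0, 6, 100).reshape(-1, 1)
y_pred = np.mean([m.predict(X_test) for m in models], axis=0)
```

In practice, scikit-learn's BaggingRegressor / BaggingClassifier (and the Random Forest estimators) wrap this same loop and add conveniences such as out-of-bag error estimation and parallel training.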