Feature Importance
Feature importance measures how much each input feature contributes to a model's predictions. It is critical for model interpretability, debugging, and feature selection.
Methods
- SHAP Values — game-theoretic approach; consistent and theoretically grounded (first sketch below)
- Permutation Importance — measure the drop in a model's score when a feature's values are randomly shuffled (second sketch below)
- Feature Selection — drop low-importance features to simplify the model (third sketch below)
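A minimal sketch of SHAP-based importance using the `shap` library. The diabetes dataset, `RandomForestRegressor`, and `TreeExplainer` are illustrative assumptions, not the only options; newer `shap` versions also offer a unified `shap.Explainer` interface.

```python
import numpy as np
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# TreeExplainer computes exact SHAP values efficiently for tree ensembles.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)  # shape: (n_samples, n_features)

# Global importance: mean absolute SHAP value per feature.
importance = np.abs(shap_values).mean(axis=0)
for name, score in sorted(zip(X.columns, importance), key=lambda t: -t[1]):
    print(f"{name}: {score:.4f}")
```

A sketch of permutation importance with scikit-learn's `permutation_importance`; the dataset, model, and `n_repeats=10` are assumptions for illustration. Scoring on held-out data avoids crediting features the model merely memorized.

```python
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Shuffle each feature column n_repeats times on the held-out split and
# record the drop in the model's score (R^2 for regressors by default).
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)

# Report features by mean score drop, largest first.
for i in result.importances_mean.argsort()[::-1]:
    print(f"{X.columns[i]}: {result.importances_mean[i]:.4f} "
          f"+/- {result.importances_std[i]:.4f}")
```

A sketch of importance-driven selection with scikit-learn's `SelectFromModel`; the `threshold="median"` cutoff is an arbitrary assumption worth tuning per problem.

```python
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import SelectFromModel

X, y = load_diabetes(return_X_y=True, as_frame=True)

# Fit the estimator, then keep only features whose importance exceeds
# the median importance (threshold choice is an assumption; tune it).
selector = SelectFromModel(
    RandomForestRegressor(n_estimators=200, random_state=0),
    threshold="median",
).fit(X, y)

kept = X.columns[selector.get_support()]
print("Kept features:", list(kept))
X_reduced = selector.transform(X)  # shape: (n_samples, n_kept_features)
```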
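Permutation importance is model-agnostic but can understate the importance of correlated features, since shuffling one still leaves its correlated partner available to the model.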
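`X_reduced` can then be fed to a downstream model; refitting and re-evaluating on a held-out split confirms the pruning did not hurt performance.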
Related
- Decision Trees (built-in feature importance)
- Feature Selection (use importance for selection)