Feature Importance


Feature importance measures how much each input feature contributes to a model's predictions. It is critical for model interpretability, debugging, and feature selection.

Methods

  • SHAP Values — a game-theoretic approach that attributes each prediction to feature contributions; consistent and theoretically grounded (sketch below)
  • Permutation Importance — measures the drop in model score when a feature's values are randomly shuffled (sketch below)
  • Feature Selection — removes low-importance features to simplify the model (sketch below)
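
A minimal sketch of global SHAP importances, assuming the `shap` package is installed and the model is a tree ensemble; the diabetes dataset and RandomForestRegressor are illustrative stand-ins, not prescribed choices:

```python
import numpy as np
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# TreeExplainer computes exact Shapley values efficiently for tree ensembles.
explainer = shap.TreeExplainer(model)
shap_values = explainer(X)

# The mean absolute SHAP value per feature gives a global importance ranking.
importance = np.abs(shap_values.values).mean(axis=0)
for name, score in sorted(zip(X.columns, importance), key=lambda t: -t[1]):
    print(f"{name}: {score:.3f}")
```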
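
A sketch of permutation importance using scikit-learn's `permutation_importance`; the model and data are placeholders. Scoring on held-out data avoids inflating the importance of features the model merely memorized:

```python
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)

# Shuffle each feature n_repeats times and record the drop in test score;
# a large drop means the model relies heavily on that feature.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for name, mean, std in zip(X.columns, result.importances_mean, result.importances_std):
    print(f"{name}: {mean:.3f} +/- {std:.3f}")
```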
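
One way to act on an importance ranking is scikit-learn's `SelectFromModel`, which drops features whose model-based importance falls below a threshold; the median threshold here is one reasonable default, not the only option:

```python
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import SelectFromModel

X, y = load_diabetes(return_X_y=True, as_frame=True)
selector = SelectFromModel(
    RandomForestRegressor(n_estimators=100, random_state=0),
    threshold="median",  # keep features with importance above the median
).fit(X, y)

kept = X.columns[selector.get_support()]
print("Kept features:", list(kept))
X_reduced = selector.transform(X)  # reduced design matrix for retraining
```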

Tags: ml, feature-importance, explainability