Hyperparameter Tuning
Systematically searching for the best model configuration. Hyperparameters are settings that are not learned from the data (e.g., learning rate, tree depth, regularization strength); they are fixed before training begins.
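A minimal sketch of the distinction, assuming scikit-learn is available; the specific estimators and values are illustrative only:

```python
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

# Hyperparameters are chosen by the practitioner at construction time...
tree = DecisionTreeClassifier(max_depth=5)   # tree depth
logreg = LogisticRegression(C=0.1)           # inverse regularization strength

# ...while model parameters (split thresholds, coefficients) are
# estimated from the data during fit(X, y).
```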
Methods
- Grid Search — exhaustive search over a predefined parameter grid
- Random Search — random sampling of configurations, often more efficient than grid search (both are sketched below)
- Bayesian Optimization — model the objective function, guided search (Optuna, Hyperopt)
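A minimal sketch of grid search versus random search using scikit-learn; the dataset, model, and parameter ranges are illustrative assumptions:

```python
from scipy.stats import randint
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = load_iris(return_X_y=True)
model = RandomForestClassifier(random_state=0)

# Grid search: evaluates every combination in the grid (3 x 3 = 9 configs here).
grid = GridSearchCV(
    model,
    param_grid={"max_depth": [3, 5, 10], "n_estimators": [50, 100, 200]},
    cv=5,
)
grid.fit(X, y)

# Random search: samples a fixed budget of configurations from distributions,
# often matching grid search quality with far fewer evaluations.
rand = RandomizedSearchCV(
    model,
    param_distributions={"max_depth": randint(2, 20),
                         "n_estimators": randint(50, 300)},
    n_iter=20,
    cv=5,
    random_state=0,
)
rand.fit(X, y)

print(grid.best_params_, grid.best_score_)
print(rand.best_params_, rand.best_score_)
```

Both searchers select the configuration with the best cross-validated score; in practice the budget (`n_iter`, grid size) and the scoring metric are the main knobs to adjust.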
Related
- Train-Validation-Test Split (uses validation performance)
- Bayesian Methods (Bayesian optimization)