Batch Normalization
Normalizes each feature's activations within a mini-batch to zero mean and unit variance, then applies a learned scale (gamma) and shift (beta) so the layer keeps its representational capacity. This stabilizes training, allows higher learning rates, and acts as a mild regularizer. At inference time, running estimates of the batch statistics are used instead of per-batch values. Widely used in CNNs.
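A minimal sketch of the forward pass in training mode, written in NumPy; the function and variable names here are illustrative, not from any particular library.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Batch normalization over a mini-batch x of shape (N, D).

    gamma, beta: learned per-feature scale and shift, shape (D,).
    eps: small constant for numerical stability.
    """
    mu = x.mean(axis=0)                     # per-feature mean over the batch
    var = x.var(axis=0)                     # per-feature variance over the batch
    x_hat = (x - mu) / np.sqrt(var + eps)   # zero mean, unit variance
    return gamma * x_hat + beta             # learned scale and shift

# Example: a batch of 4 samples with 3 features, far from zero mean / unit variance
x = np.random.randn(4, 3) * 5 + 2
out = batch_norm_forward(x, gamma=np.ones(3), beta=np.zeros(3))
print(out.mean(axis=0), out.var(axis=0))    # approximately 0 and 1 per feature
```

In a real training loop, running averages of `mu` and `var` would also be tracked so they can replace the batch statistics at inference time.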
Related
- Layer Normalization (alternative for Transformers)
- Dropout (another regularization technique)