Batch Normalization

← Back to Neural Network Fundamentals

Normalizes each feature's activations across a mini-batch to zero mean and unit variance, then applies a learnable scale (gamma) and shift (beta) so the layer can still represent any activation range. This stabilizes training, allows higher learning rates, and acts as a mild regularizer. Widely used in CNNs.
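A minimal sketch of the training-mode forward pass in NumPy (the name `batch_norm_forward` is illustrative; the running mean/variance tracking used at inference is omitted here):

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Training-mode batch norm. x: (N, D) activations; gamma, beta: (D,)."""
    mu = x.mean(axis=0)                    # per-feature mean over the batch
    var = x.var(axis=0)                    # per-feature variance over the batch
    x_hat = (x - mu) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta            # learnable rescale and shift

# Toy usage: a batch of 4 samples with 3 features each.
x = np.random.randn(4, 3) * 10 + 5
out = batch_norm_forward(x, gamma=np.ones(3), beta=np.zeros(3))
print(out.mean(axis=0), out.var(axis=0))   # ~0 and ~1 per feature
```

At inference time, the batch statistics are replaced by running estimates of the mean and variance collected during training, so single examples can be normalized deterministically.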

Related

  • Layer Normalization (alternative for Transformers)
  • Dropout (another regularization technique)


