Software Engineering KB


Dropout

Feb 10, 2026 · 1 min read

  • deep-learning
  • dropout
  • regularization


← Back to Neural Network Fundamentals

A regularization technique that randomly sets each neuron's output to zero during training with probability p (typically 0.1-0.5). This forces the network to learn redundant representations and prevents co-adaptation of neurons. Dropout is disabled during inference; in the common inverted-dropout formulation, surviving activations are scaled by 1/(1 − p) during training so that expected activations match between training and inference.
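A minimal NumPy sketch of inverted dropout under the description above (the function name and signature are illustrative, not part of this note):

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each activation with probability p during
    training, and scale survivors by 1/(1-p) so the expected activation
    is unchanged. At inference the input passes through untouched."""
    if not training or p == 0.0:
        return x  # dropout is disabled during inference
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
    return x * mask / (1.0 - p)
```

Because the scaling happens at training time, the inference path needs no special handling, which is why frameworks implement dropout this way in practice.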

Related

  • Regularization (broader regularization concept)
  • Batch Normalization (also provides regularization)


Backlinks

  • Neural Network Fundamentals
  • Batch Normalization
