Dropout
Regularization technique that randomly sets neuron outputs to zero during training, each with probability p (typically 0.1-0.5). This forces the network to learn redundant representations and prevents co-adaptation of neurons. Dropout is disabled at inference; in the common inverted-dropout formulation, surviving activations are scaled by 1/(1 - p) during training so no rescaling is needed at test time.
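A minimal sketch of inverted dropout in NumPy, assuming a drop probability `p` and a `training` flag (both hypothetical names chosen here for illustration):

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each element with probability p during training.

    Surviving activations are scaled by 1 / (1 - p) so their expected value
    matches inference, where dropout is a no-op.
    """
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p   # keep each element with probability 1 - p
    return x * mask / (1.0 - p)

# Example: roughly half the activations are zeroed, the rest doubled.
activations = np.ones((2, 4))
print(dropout(activations, p=0.5, training=True))
print(dropout(activations, p=0.5, training=False))  # unchanged at inference
```

Framework layers such as `torch.nn.Dropout` implement the same idea and switch behavior automatically based on train/eval mode.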
Related
- Regularization (broader regularization concept)
- Batch Normalization (also provides regularization)