Backpropagation
The algorithm for computing gradients of the loss with respect to each weight in the network using the chain rule. Enables training deep networks by propagating error signals backward from the output to earlier layers.
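As a concrete statement of the chain rule at work, one common formulation (notation assumed here, not from the original entry: W^(l) for layer weights, a^(l) = σ(z^(l)) for activations, δ^(l) for the error signal at layer l) is:

```latex
\delta^{(l)} = \left(W^{(l+1)}\right)^{\top} \delta^{(l+1)} \odot \sigma'\!\left(z^{(l)}\right),
\qquad
\frac{\partial L}{\partial W^{(l)}} = \delta^{(l)} \left(a^{(l-1)}\right)^{\top}
```

The first equation propagates the error signal one layer backward; the second converts it into the weight gradient for that layer.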
Key Properties
- Chain rule applied layer by layer
- Forward pass computes outputs; backward pass computes gradients (see the sketch after this list)
- Gradients used by optimizers to update weights
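A minimal sketch of one forward/backward pass for a two-layer network, assuming tanh hidden units and a mean-squared-error loss (all names and sizes here are illustrative, not from the original entry):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 samples, 3 features, scalar regression target (illustrative only).
X = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# One hidden layer of width 5; small random initial weights.
W1 = rng.normal(size=(3, 5)) * 0.1
W2 = rng.normal(size=(5, 1)) * 0.1

# Forward pass: compute outputs, caching intermediates for the backward pass.
z1 = X @ W1                        # hidden pre-activation
a1 = np.tanh(z1)                   # hidden activation
y_hat = a1 @ W2                    # linear output
loss = np.mean((y_hat - y) ** 2)   # mean squared error

# Backward pass: chain rule applied layer by layer, output to input.
d_yhat = 2 * (y_hat - y) / y.shape[0]   # dL/dy_hat
dW2 = a1.T @ d_yhat                     # dL/dW2
d_a1 = d_yhat @ W2.T                    # propagate error signal to hidden layer
d_z1 = d_a1 * (1 - a1 ** 2)             # tanh'(z1) = 1 - tanh(z1)^2
dW1 = X.T @ d_z1                        # dL/dW1

# An optimizer (plain gradient descent here) uses the gradients to update weights.
lr = 0.1
W1 -= lr * dW1
W2 -= lr * dW2
```

Each backward line is one application of the chain rule, moving from the loss back toward the input; the last two lines show an optimizer consuming the gradients, as the third property above describes.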
Related
- Loss Functions (what backprop optimizes)
- Optimizers (use gradients to update weights)
- Residual Connections (help gradient flow in deep networks)