# Backpropagation

## About
- The primary algorithm for computing the gradients that gradient descent uses to train neural networks.
## Process
- First, the output value of each node is calculated (and cached) in a forward pass.
- Then, the partial derivative of the error with respect to each parameter is calculated in a backward pass through the graph, applying the chain rule node by node (see the sketch below).
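A minimal sketch of these two passes, assuming a one-hidden-unit network with a sigmoid activation and squared error. All variable names and values here are illustrative, not from this wiki:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Example parameters and one training sample (arbitrary values).
w1, w2 = 0.5, -0.3
x, y = 1.0, 0.0
lr = 0.1  # learning rate for the gradient-descent update

# Forward pass: compute and cache each node's output.
z = w1 * x                 # pre-activation
h = sigmoid(z)             # hidden activation (cached for the backward pass)
y_hat = w2 * h             # network output
loss = 0.5 * (y_hat - y) ** 2

# Backward pass: chain rule, from the error back to each parameter.
dloss_dyhat = y_hat - y               # d(0.5*(y_hat - y)^2) / dy_hat
dloss_dw2 = dloss_dyhat * h           # since y_hat = w2 * h
dloss_dh = dloss_dyhat * w2
dloss_dz = dloss_dh * h * (1 - h)     # sigmoid'(z) = h * (1 - h)
dloss_dw1 = dloss_dz * x              # since z = w1 * x

# One gradient-descent step using the backpropagated gradients.
w1 -= lr * dloss_dw1
w2 -= lr * dloss_dw2
print(f"loss={loss:.4f}  dL/dw1={dloss_dw1:.4f}  dL/dw2={dloss_dw2:.4f}")
```

Frameworks such as TensorFlow and PyTorch automate exactly this bookkeeping: the forward pass builds a graph of cached node outputs, and the backward pass replays it in reverse to accumulate each parameter's gradient.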
## FAQ
- What is backpropagation?
- When is it used during training?
## See also
- Gradient descent
- Batch size
- Mini-Batch
- Forward pass
- Chain rule