backpropagation - AshokBhat/ml GitHub Wiki
- The primary algorithm for computing the gradients needed to perform gradient descent on neural networks.
- First, the output values of each node are calculated (and cached) in a forward pass.
- Then, the partial derivative of the error with respect to each parameter is calculated in a backward pass through the graph.
- What is backpropagation?
- When is it used during training?
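The two passes described above can be sketched as follows. This is a minimal illustrative example, not a reference implementation: it assumes a single hidden layer, sigmoid activations, and squared error, with all variable names made up for the sketch.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=(3, 1))     # input (illustrative shapes)
y = np.array([[1.0]])           # target
W1 = rng.normal(size=(4, 3))    # hidden-layer weights
W2 = rng.normal(size=(1, 4))    # output-layer weights

# Forward pass: compute and cache the output of each node
z1 = W1 @ x;  a1 = sigmoid(z1)  # hidden activations (cached)
z2 = W2 @ a1; a2 = sigmoid(z2)  # output activation (cached)
loss = 0.5 * float((a2 - y) ** 2)

# Backward pass: partial derivative of the error w.r.t. each
# parameter, propagated back through the graph via the chain rule
delta2 = (a2 - y) * a2 * (1 - a2)          # dL/dz2
dW2 = delta2 @ a1.T                        # dL/dW2
delta1 = (W2.T @ delta2) * a1 * (1 - a1)   # dL/dz1
dW1 = delta1 @ x.T                         # dL/dW1

# One gradient-descent step using the computed gradients
lr = 0.1
W2 -= lr * dW2
W1 -= lr * dW1
```

Note how the backward pass reuses the cached activations `a1` and `a2` from the forward pass; this reuse is what makes backpropagation efficient.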