Mini batch gradient descent - AshokBhat/ml GitHub Wiki
## Description
- A type of gradient descent
- Each update step processes a mini-batch of N randomly sampled examples, where N is much smaller than the total number of training examples
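The update loop above can be sketched as follows. This is a minimal illustration using linear regression with a mean-squared-error loss; the function name `minibatch_gd` and all hyperparameter values are illustrative choices, not from the original page.

```python
import numpy as np

def minibatch_gd(X, y, batch_size=32, lr=0.1, epochs=200, seed=0):
    """Mini-batch gradient descent for linear regression (MSE loss)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        idx = rng.permutation(n)                 # shuffle once per epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            # Gradient of MSE, estimated from the mini-batch only
            grad = 2 * Xb.T @ (Xb @ w - yb) / len(batch)
            w -= lr * grad                       # update after each mini-batch
    return w

# Synthetic data with known weights: y = 3*x0 - 2*x1
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))
y = X @ np.array([3.0, -2.0])
w = minibatch_gd(X, y)
```

Note that each parameter update uses only `batch_size` examples, so many updates happen per pass over the data, unlike batch gradient descent, which makes one update per full pass.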
## FAQ about Mini-batch Gradient Descent
- What is it? A variant of gradient descent that updates the parameters using a small random subset (mini-batch) of the training data at each step.
- How is it different from others? Batch gradient descent uses all training examples per update; stochastic gradient descent uses a single example; mini-batch gradient descent uses N examples, a middle ground between the two.
- What are the pros and cons? Pros: faster updates than batch gradient descent, less noisy gradients than stochastic gradient descent, and mini-batches vectorize well on modern hardware. Cons: introduces the batch size as an extra hyperparameter, and gradients remain noisy estimates of the full-batch gradient.
- Where is it used? It is the standard optimization approach for training neural networks and most large-scale machine learning models.
## See also
- [Batch gradient descent](/AshokBhat/ml/wiki/Batch-gradient-descent)
- [Stochastic gradient descent](/AshokBhat/ml/wiki/Stochastic-gradient-descent)