# Stochastic Gradient Descent
## What is Stochastic Gradient Descent?
Stochastic gradient descent (often shortened to SGD), also known as incremental gradient descent, is a stochastic approximation of gradient descent optimization: an iterative method for minimizing an objective function that is written as a sum of differentiable functions. Instead of computing the gradient over the entire dataset at every step, SGD estimates it from a single randomly chosen example (or a small batch), making each iteration much cheaper. In other words, SGD tries to find minima or maxima by iteration.
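As a concrete illustration, here is a minimal sketch of SGD in Python, fitting a simple linear model `y = w*x + b` by minimizing squared error one example at a time. The function name, parameters, and toy data are all made up for this example; they are not part of any particular library.

```python
import random

def sgd_linear_regression(data, lr=0.01, epochs=200):
    """Fit y = w*x + b with SGD on the squared-error loss 0.5*(pred - y)^2."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        random.shuffle(data)       # visit the examples in a random order
        for x, y in data:
            pred = w * x + b
            error = pred - y       # d(loss)/d(pred) for squared error
            w -= lr * error * x    # step each parameter against its gradient
            b -= lr * error
    return w, b

if __name__ == "__main__":
    # toy data drawn from y = 2x + 1
    points = [(x, 2 * x + 1) for x in range(-5, 6)]
    w, b = sgd_linear_regression(points, lr=0.01, epochs=200)
    print(w, b)  # should approach w ≈ 2, b ≈ 1
```

Note that each update uses the gradient from a single example rather than the full sum over the dataset; that per-example update is what distinguishes SGD from ordinary (batch) gradient descent.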
Reference: https://en.wikipedia.org/wiki/Stochastic_gradient_descent