# Nonlinear Optimization
## Methods

### Gradient-based methods

These methods use first-derivative (gradient) or, additionally, second-derivative (Hessian) information.

Algorithms:
- Gradient descent (Steepest descent)
- Newton's method
- Quasi-Newton methods (no exact Hessian required; a generalized secant method)
- Gauss-Newton algorithm
- Levenberg-Marquardt algorithm (damped least squares)
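As a minimal sketch of the simplest member of this family, gradient descent repeatedly steps against the gradient until it is (numerically) zero. The function, step size, and stopping tolerance below are illustrative choices, not from the original page:

```python
def grad_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=1000):
    """Minimize a 1-D function by stepping against its gradient."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:  # stop once the gradient is numerically zero
            break
        x -= lr * g
    return x

# Example: f(x) = (x - 3)^2, so f'(x) = 2*(x - 3); the minimum is at x = 3.
x_min = grad_descent(lambda x: 2.0 * (x - 3.0), x0=0.0)
print(round(x_min, 4))  # → 3.0
```

Newton and quasi-Newton methods differ only in the step: instead of `lr * g`, they scale the gradient by the inverse (exact or approximated) Hessian, which typically converges in far fewer iterations.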
Algorithms used by Mathematica:
- sequential quadratic programming (SQP) method
- augmented Lagrangian method
- (nonlinear) interior point method
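Mathematica's internal code is not shown here, but SQP is also available elsewhere; for illustration, SciPy's `SLSQP` method is a sequential quadratic programming solver. The objective and constraint below are made-up examples:

```python
from scipy.optimize import minimize

# Minimize f(x, y) = (x - 1)^2 + (y - 2)^2 subject to x + y <= 2,
# using SLSQP, a sequential quadratic programming (SQP) method.
res = minimize(
    fun=lambda v: (v[0] - 1.0) ** 2 + (v[1] - 2.0) ** 2,
    x0=[0.0, 0.0],
    method="SLSQP",
    constraints=[{"type": "ineq", "fun": lambda v: 2.0 - v[0] - v[1]}],
)
print(res.x)  # optimum near (0.5, 1.5), on the constraint boundary
```

The unconstrained minimum (1, 2) violates the constraint, so the solver lands on the boundary x + y = 2 at the closest feasible point.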
### Direct search methods

These derivative-free methods rely only on objective-function evaluations:
- Nelder-Mead
- genetic algorithm and differential evolution
- simulated annealing
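Because direct search methods need no derivatives, they can handle non-smooth objectives. A small Nelder-Mead example via SciPy (the objective is an illustrative choice):

```python
from scipy.optimize import minimize

# Nelder-Mead uses only function evaluations, so it works even on
# non-differentiable objectives such as f(x, y) = |x - 1| + |y + 2|.
res = minimize(
    lambda v: abs(v[0] - 1.0) + abs(v[1] + 2.0),
    x0=[0.0, 0.0],
    method="Nelder-Mead",
)
print(res.x)  # near (1, -2)
```

Genetic algorithms, differential evolution, and simulated annealing follow the same derivative-free idea but add randomized global search, which helps escape local minima at the cost of many more function evaluations.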