Reference List of All Optimizers in the PyPop Library
“It is both enjoyable and educational to hear the ideas directly from the creators”.
From Hennessy, J.L. and Patterson, D.A., 2019.
Computer architecture: A quantitative approach (Sixth Edition). Elsevier.
Here we thank all the authors/coders for making their source code openly available, which makes this library much easier to develop and benchmark.
Evolution Strategy (ES)
MAES

Beyer, H.G. and Sendhoff, B., 2017. Simplify your covariance matrix adaptation evolution strategy. IEEE Transactions on Evolutionary Computation, 21(5), pp.746-759.

Beyer, H.G., 2020, July. Design principles for matrix adaptation evolution strategies. In Proceedings of the Genetic and Evolutionary Computation Conference Companion (pp. 682-700).
Particle Swarm Optimizer (PSO)
The Particle Swarm Optimizer (PSO) class is the base class of all PSO variants, including the Particle Swarm Optimizer with Global Topology (PSOGT), the Particle Swarm Optimizer with Ring Topology (PSORT), and the Comprehensive Learning Particle Swarm Optimizer (CLPSO).
 Kennedy, J. and Eberhart, R., 1995, November. Particle swarm optimization. In Proceedings of International Conference on Neural Networks (Vol. 4, pp. 1942-1948). IEEE.
 Shi, Y. and Eberhart, R., 1998, May. A modified particle swarm optimizer. In IEEE World Congress on Computational Intelligence (pp. 69-73). IEEE.
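To make the shared idea behind these PSO variants concrete, here is a minimal global-topology PSO sketch with an inertia weight (after Shi and Eberhart, 1998). All function and parameter names below are illustrative assumptions for this sketch, not PyPop's actual API:

```python
import numpy as np

def pso_minimize(f, lb, ub, n_particles=20, n_iters=200,
                 w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal global-topology PSO sketch (illustrative, not PyPop's API)."""
    rng = np.random.default_rng(seed)
    d = len(lb)
    x = rng.uniform(lb, ub, (n_particles, d))   # positions
    v = np.zeros((n_particles, d))              # velocities
    pbest_x = x.copy()                          # personal bests
    pbest_f = np.apply_along_axis(f, 1, x)
    g = int(np.argmin(pbest_f))                 # global-best index
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, d))
        # inertia + cognitive (personal best) + social (global best) terms
        v = w * v + c1 * r1 * (pbest_x - x) + c2 * r2 * (pbest_x[g] - x)
        x = np.clip(x + v, lb, ub)
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pbest_f
        pbest_x[improved], pbest_f[improved] = x[improved], fx[improved]
        g = int(np.argmin(pbest_f))
    return pbest_x[g], pbest_f[g]
```

A ring-topology variant would replace `pbest_x[g]` with each particle's best neighbor, and CLPSO would let each dimension learn from a possibly different exemplar.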
CLPSO
 Liang, J.J., Qin, A.K., Suganthan, P.N. and Baskar, S., 2006. Comprehensive learning particle swarm optimizer for global optimization of multimodal functions. IEEE Transactions on Evolutionary Computation, 10(3), pp.281-295.
 https://github.com/PNSuganthan/CODES/blob/master/2006IEEETECCLPSO.zip (thanks to Suganthan, P.N. for making its MATLAB source code openly available)
Simulated Annealing (SA)
The Simulated Annealing (SA) class is the base class of all SA variants, including the default SA version from [Corana et al., 1987] (SA) and Enhanced Simulated Annealing from [Siarry et al., 1997] (ESA).
 Kirkpatrick, S., Gelatt, C.D. and Vecchi, M.P., 1983. Optimization by simulated annealing. Science, 220(4598), pp.671-680.
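The core loop all these SA variants share can be sketched in the style of Kirkpatrick et al. (1983): Gaussian moves, Metropolis acceptance, and geometric cooling. The function name, move rule, and defaults below are assumptions for illustration, not PyPop's SA/ESA classes:

```python
import math
import random

def sa_minimize(f, x0, step=1.0, t0=1.0, cooling=0.99, n_iters=2000, seed=1):
    """Bare-bones simulated annealing sketch (illustrative, not PyPop's API)."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best_x, best_f = list(x), fx
    t = t0
    for _ in range(n_iters):
        y = [xi + rng.gauss(0.0, step) for xi in x]   # candidate move
        fy = f(y)
        # always accept improvements; accept worse moves with prob exp(-delta/t)
        if fy <= fx or rng.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < best_f:
                best_x, best_f = list(x), fx
        t *= cooling  # geometric cooling schedule
    return best_x, best_f
```

The Corana et al. and Siarry et al. variants below refine exactly the pieces this sketch hard-codes: the step-size control and the cooling/neighborhood schedules.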
SA
 Corana, A., Marchesi, M., Martini, C. and Ridella, S., 1987. Minimizing multimodal functions of continuous variables with the “simulated annealing” algorithm. ACM Transactions on Mathematical Software, 13(3), pp.262-280.
ESA
 Siarry, P., Berthiau, G., Durdin, F. and Haussy, J., 1997. Enhanced simulated annealing for globally minimizing functions of many continuous variables. ACM Transactions on Mathematical Software, 23(2), pp.209-228.
 Hwang, D., Rust, A.G., Ramsey, S., Smith, J.J., Leslie, D.M., Weston, A.D., De Atauri, P., Aitchison, J.D., Hood, L., Siegel, A.F. and Bolouri, H., 2005. A data integration methodology for systems biology. Proceedings of the National Academy of Sciences, 102(48), pp.17296-17301.
 Kakadiaris, I.A., Passalis, G., Toderici, G., Murtuza, M.N., Lu, Y., Karampatziakis, N. and Theoharis, T., 2007. Three-dimensional face recognition in the presence of facial expressions: An annotated deformable model approach. IEEE Transactions on Pattern Analysis and Machine Intelligence, 29(4), pp.640-649.
 Passalis, G., Perakis, P., Theoharis, T. and Kakadiaris, I.A., 2011. Using facial symmetry to handle pose variations in real-world 3D face recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33(10), pp.1938-1951.
Random (Stochastic) Search (RS)
The Random Search (RS) class is the base class of all individual-based stochastic search optimizers, including Pure Random Search (PRS) and the Random Hill Climber (RHC).
 Brooks, S.H., 1958. A discussion of random methods for seeking maxima. Operations Research, 6(2), pp.244-251.
“While pure random search (the Monte Carlo method), the simplest of all optimization techniques, is universally applicable, it is also much too inefficient to be taken seriously.”
From Preuss, M., 2015. Multimodal optimization by means of evolutionary algorithms.
Springer International Publishing.
PRS
(Pure Random Search)
 Fogel, D.B., 1997. Chapter A1.1: Introduction. Handbook of Evolutionary Computation. Oxford University Press. [Baseline: pure random search]
 Zabinsky, Z.B., 2003. Pure random search and pure adaptive search. In Stochastic Adaptive Search for Global Optimization (pp. 25-54). Springer, Boston, MA.
 Gomez, F., Schmidhuber, J. and Miikkulainen, R., 2008. Accelerated neural evolution through cooperatively coevolved synapses. Journal of Machine Learning Research, 9(31), pp.937-965. [Baseline: a.k.a. random weight guessing]
 Mouret, J.B. and Clune, J., 2015. Illuminating search spaces by mapping elites. arXiv preprint arXiv:1504.04909. [Baseline: a.k.a. random sampling]
 Towfighi, S., 2020. PyGOURGS: Global optimization of n-ary tree representable problems using uniform random global search. Journal of Open Source Software, 5(47), p.2074.
 Fontaine, M. and Nikolaidis, S., 2021, July. A quality diversity approach to automatically generating human-robot interaction scenarios in shared autonomy. In Proceedings of Robotics: Science and Systems (Vol. 17). [Baseline: a.k.a. standard Monte Carlo simulation]
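The baseline these references describe fits in a few lines: uniformly sample the search box and keep the best point. The helper below is a hypothetical sketch, not PyPop's PRS class:

```python
import numpy as np

def pure_random_search(f, lb, ub, n_samples=1000, seed=1):
    """Pure random search sketch: uniform box sampling, keep the best."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lb, ub, (n_samples, len(lb)))   # i.i.d. uniform samples
    fx = np.apply_along_axis(f, 1, x)               # evaluate all samples
    i = int(np.argmin(fx))
    return x[i], float(fx[i])
```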
RHC
 https://github.com/pybrain/pybrain/blob/master/pybrain/optimization/hillclimber.py
 Mühlenbein, H., Schomisch, M. and Born, J., 1991. The parallel genetic algorithm as function optimizer. Parallel Computing, 17(6-7), pp.619-632.
 Goldberg, D.E., 1994. Genetic and evolutionary algorithms come of age. Communications of the ACM, 37(3), pp.113-120.
 Levine, D., 1997. Commentary—Genetic algorithms: A practitioner's view. INFORMS Journal on Computing, 9(3), pp.256-259.
 Silva, L., Bellon, O.R.P. and Boyer, K.L., 2005. Precision range image registration using a robust surface interpenetration measure and enhanced genetic algorithms. IEEE Transactions on Pattern Analysis and Machine Intelligence, 27(5), pp.762-776.
 Shah, D.S., Powers, J.P., Tilton, L.G., Kriegman, S., Bongard, J. and Kramer-Bottiglio, R., 2021. A soft robot that adapts to environments through shape change. Nature Machine Intelligence, 3(1), pp.51-59.
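A random hill climber, in contrast to pure random search, perturbs the current best point and accepts only improvements. The sketch below is illustrative; PyPop's RHC class (cf. the PyBrain hill climber linked above) may differ in its move rule and parameters:

```python
import numpy as np

def random_hill_climber(f, x0, sigma=0.5, n_iters=1000, seed=1):
    """Random hill climber sketch: Gaussian mutation, greedy acceptance."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(n_iters):
        y = x + rng.normal(0.0, sigma, x.shape)  # random Gaussian mutation
        fy = f(y)
        if fy < fx:                              # keep only strict improvements
            x, fx = y, fy
    return x, fx
```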