Optimization Algorithms
Below is a list of optimization algorithms implemented in smash:
Step-by-Step (SBS): Global optimization algorithm [Michel, 1989].
Nelder-Mead: Derivative-free simplex method for unconstrained optimization [Nelder and Mead, 1965].
Powell: Direction-set method for unconstrained optimization without derivatives [Powell, 1964].
Limited-memory Broyden-Fletcher-Goldfarb-Shanno Bounded (L-BFGS-B): Quasi-Newton method for bound-constrained optimization [Zhu et al., 1997].
Stochastic Gradient Descent (SGD): Iterative optimization using random mini-batches [Bottou, 2012].
Adaptive Moment Estimation (Adam): Adaptive learning rates with momentum for fast convergence [Kingma and Ba, 2014] (see the update sketch after this list).
Adaptive Gradient (Adagrad): Subgradient-based optimization with adaptive learning rates [Duchi et al., 2011].
Root Mean Square Propagation (RMSprop): Optimization with squared gradient averaging and adaptive learning rates [Graves, 2013].
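To illustrate how the adaptive gradient-based optimizers listed above update parameters, the following is a minimal sketch of a single Adam update step following Kingma and Ba (2014), applied to a toy quadratic objective. It is a didactic example under assumed default hyperparameters, not the smash implementation; all names (`adam_step`, `theta`, etc.) are placeholders.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: moment estimates, bias correction, scaled step (didactic sketch)."""
    m = beta1 * m + (1.0 - beta1) * grad        # first moment (momentum of the gradient)
    v = beta2 * v + (1.0 - beta2) * grad ** 2   # second moment (running mean of squared gradient)
    m_hat = m / (1.0 - beta1 ** t)              # bias correction for the first moment
    v_hat = v / (1.0 - beta2 ** t)              # bias correction for the second moment
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(theta) = ||theta||^2, whose gradient is 2 * theta.
theta = np.array([1.0, -2.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 2001):
    grad = 2.0 * theta
    theta, m, v = adam_step(theta, grad, m, v, t, lr=0.05)
print(theta)  # close to [0, 0]
```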
The implementations of the Nelder-Mead, Powell, and L-BFGS-B algorithms in smash are based on optimization functions provided by the SciPy library [Virtanen et al., 2020].
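For reference, the sketch below shows how these three methods are exposed through SciPy's `scipy.optimize.minimize` on a toy objective (the Rosenbrock function). It illustrates the underlying SciPy interface only, not the smash calibration API; the starting point and bounds are arbitrary choices for the example.

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

# Toy objective: the Rosenbrock function, minimized at (1, ..., 1).
x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])

# Derivative-free methods: only the objective function is needed.
for method in ("Nelder-Mead", "Powell"):
    res = minimize(rosen, x0, method=method)
    print(method, res.x)

# L-BFGS-B: quasi-Newton, uses gradients and supports bound constraints.
res = minimize(rosen, x0, method="L-BFGS-B", jac=rosen_der,
               bounds=[(0.0, 2.0)] * x0.size)
print("L-BFGS-B", res.x)
```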