Optimization Algorithms

Here’s a list of the optimization algorithms used in smash, with illustrative sketches after the list:

  • Step-by-Step (SBS): Global optimization algorithm [Michel, 1989].

  • Limited-memory Broyden-Fletcher-Goldfarb-Shanno Bounded (L-BFGS-B): Quasi-Newton method for bound-constrained optimization [Zhu et al., 1994].

  • Stochastic Gradient Descent (SGD): Iterative optimization using random mini-batches [Bottou, 2012].

  • Adaptive Moment Estimation (Adam): Adaptive learning rates with momentum for fast convergence [Kingma and Ba, 2014].

  • Adaptive Gradient (Adagrad): Subgradient-based optimization with adaptive per-parameter learning rates [Duchi et al., 2011].

  • Root Mean Square Propagation (RMSprop): Optimization with squared gradient averaging and adaptive learning rates [Graves, 2013].
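
Each algorithm is selected by name when launching an optimization. The snippet below is a minimal sketch of that selection, assuming the `smash.optimize` entry point, an `optimizer` keyword taking lowercase names (e.g. `"sbs"`, `"lbfgsb"`, `"adam"`), and the `Cance` demo dataset; the exact signature and dataset name are assumptions, not verified against a specific smash release.

```python
import smash

# Assumed demo dataset shipped with smash; any setup/mesh pair would do.
setup, mesh = smash.factory.load_dataset("Cance")
model = smash.Model(setup, mesh)

# Assumed interface: the algorithm is chosen by lowercase name.
model_sbs = smash.optimize(model, optimizer="sbs")      # global, derivative-free
model_lbf = smash.optimize(model, optimizer="lbfgsb")   # bounded quasi-Newton
model_adam = smash.optimize(model, optimizer="adam")    # adaptive gradient descent
```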
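
Independently of smash, L-BFGS-B is also available in SciPy, which makes the role of the bound constraints easy to see. A minimal sketch on a stand-in objective (the Rosenbrock function, not a smash cost function):

```python
import numpy as np
from scipy.optimize import minimize

# Rosenbrock function as a stand-in objective with a known minimum at (1, 1).
def rosen(x):
    return (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

res = minimize(rosen, x0=np.array([-1.2, 1.0]), method="L-BFGS-B",
               bounds=[(-2.0, 2.0), (-2.0, 2.0)])
print(res.x)  # close to [1. 1.], found without leaving the bounds
```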
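
The four gradient-based algorithms (SGD, Adam, Adagrad, RMSprop) differ only in how they turn a gradient into a parameter update. The sketch below implements each update rule on a toy one-dimensional quadratic cost; this is a stand-alone illustration of the rules, not smash code, and the learning rates are illustrative.

```python
import numpy as np

def grad(theta):
    # Gradient of the toy cost J(theta) = theta**2, minimized at theta = 0.
    return 2.0 * theta

def sgd(g, state, lr=0.1):
    # Plain gradient step (mini-batch sampling omitted in this 1-D toy).
    return -lr * g, state

def adagrad(g, state, lr=0.5, eps=1e-8):
    # Accumulate squared gradients; per-parameter step sizes only ever decay.
    state["G"] = state.get("G", 0.0) + g**2
    return -lr * g / (np.sqrt(state["G"]) + eps), state

def rmsprop(g, state, lr=0.1, rho=0.9, eps=1e-8):
    # Exponential moving average of squared gradients instead of a full sum.
    state["v"] = rho * state.get("v", 0.0) + (1.0 - rho) * g**2
    return -lr * g / (np.sqrt(state["v"]) + eps), state

def adam(g, state, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Moving averages of the gradient (momentum) and its square, bias-corrected.
    t = state["t"] = state.get("t", 0) + 1
    state["m"] = b1 * state.get("m", 0.0) + (1.0 - b1) * g
    state["v"] = b2 * state.get("v", 0.0) + (1.0 - b2) * g**2
    m_hat = state["m"] / (1.0 - b1**t)
    v_hat = state["v"] / (1.0 - b2**t)
    return -lr * m_hat / (np.sqrt(v_hat) + eps), state

for name, step in [("SGD", sgd), ("Adagrad", adagrad),
                   ("RMSprop", rmsprop), ("Adam", adam)]:
    theta, state = 5.0, {}
    for _ in range(500):
        update, state = step(grad(theta), state)
        theta += update
    print(f"{name:8s} theta = {theta:+.4f}")  # each ends at or near 0
```

Note that SGD's defining ingredient, random mini-batch sampling, only shows up with noisy or decomposable objectives; on this deterministic toy it reduces to plain gradient descent.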