.. _math_num_documentation.optimization_algorithm:

======================
Optimization Algorithm
======================

Here is a list of the optimization algorithms used in `smash`:

- Step-by-Step (SBS): a global optimization algorithm :cite:p:`Michel1989`.

- Limited-memory Broyden-Fletcher-Goldfarb-Shanno Bounded (L-BFGS-B): a quasi-Newton method for bounded optimization :cite:p:`zhu1994bfgs`.

- Stochastic Gradient Descent (SGD): iterative optimization using random mini-batches :cite:p:`bottou2012stochastic`.

- Adaptive Moment Estimation (Adam): adaptive learning rates with momentum for fast convergence :cite:p:`kingma2014adam`.

- Adaptive Gradient (Adagrad): subgradient-based optimization with adaptive learning rates :cite:p:`duchi2011adaptive`.

- Root Mean Square Propagation (RMSprop): optimization with squared-gradient averaging and adaptive learning rates :cite:p:`graves2013generating`.
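
To illustrate how the adaptive gradient-based schemes above update the parameters, here is a minimal NumPy sketch of a single Adam step, following :cite:p:`kingma2014adam`. It is an illustrative sketch of the published update rule only, not the implementation used inside `smash`; the function and variable names are chosen for this example.

.. code-block:: python

    import numpy as np

    def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
        """One Adam update: exponential moving averages of the gradient and its
        square are bias-corrected, then used to scale the parameter step."""
        m = beta1 * m + (1.0 - beta1) * grad      # first-moment (mean) estimate
        v = beta2 * v + (1.0 - beta2) * grad**2   # second-moment (uncentered variance) estimate
        m_hat = m / (1.0 - beta1**t)              # bias correction (t starts at 1)
        v_hat = v / (1.0 - beta2**t)
        theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
        return theta, m, v

Setting ``beta1 = 0`` in this sketch yields an RMSprop-like update (no momentum term), which shows how closely these adaptive methods are related.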