Result: Adaptive high order stochastic descent algorithms

Title:
Adaptive high order stochastic descent algorithms
Publisher Information:
Zenodo
Publication Year:
2022
Collection:
Zenodo
Document Type:
Journal text
Language:
unknown
DOI:
10.5281/zenodo.7257154
Rights:
Creative Commons Attribution Non Commercial No Derivatives 4.0 International ; cc-by-nc-nd-4.0 ; https://creativecommons.org/licenses/by-nc-nd/4.0/legalcode
Accession Number:
edsbas.D6347232
Database:
BASE

Further Information

Slides presented at the NANMATH 2022 conference, Cluj. Presentation abstract: motivated by statistical learning applications, stochastic descent optimization algorithms are widely used today to tackle difficult numerical problems. One of the best known among them, Stochastic Gradient Descent (SGD), has been extended in various ways, resulting in Adam, Nesterov, momentum, etc. After a brief introduction to this framework, we introduce in this talk a new approach, called SGD-G2, which is a high-order Runge-Kutta stochastic descent algorithm; the procedure allows for step adaptation in order to strike an optimal balance between convergence speed and stability. Numerical tests on standard datasets in machine learning are also presented, together with further theoretical extensions.
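
To make the general idea concrete, below is a minimal Python/NumPy sketch of one second-order Runge-Kutta (Heun) stochastic descent step with an error-based adaptive learning rate. The gradient oracle grad_fn, the tolerance tol, and the halve/grow adaptation rule are illustrative assumptions for this sketch, not the SGD-G2 specification from the talk, which defines its own step-adaptation procedure.

import numpy as np

def heun_sgd_step(theta, grad_fn, batch, lr, tol=1e-2):
    """One Heun (RK2) stochastic descent step with a simple adaptive
    learning rate. grad_fn(theta, batch) returns a stochastic gradient
    estimate; the error-based adaptation rule below is illustrative,
    not the exact SGD-G2 procedure."""
    g1 = grad_fn(theta, batch)                   # order-1 (Euler) slope
    theta_euler = theta - lr * g1                # first-order predictor
    g2 = grad_fn(theta_euler, batch)             # slope at predicted point
    theta_heun = theta - lr * 0.5 * (g1 + g2)    # order-2 (Heun) update

    # Local error estimate: gap between the order-1 and order-2 updates.
    # Shrink the step if the gap is large, grow it cautiously if small.
    err = np.linalg.norm(theta_heun - theta_euler)
    scale = np.linalg.norm(theta) + 1e-12
    if err > tol * scale:
        lr *= 0.5
    else:
        lr *= 1.1
    return theta_heun, lr

# Toy usage: noisy gradients of f(theta) = 0.5 * ||theta||^2.
rng = np.random.default_rng(0)
grad_fn = lambda th, batch: th + 0.01 * rng.standard_normal(th.shape)
theta, lr = np.ones(5), 0.1
for _ in range(200):
    theta, lr = heun_sgd_step(theta, grad_fn, batch=None, lr=lr)

The design choice mirrored here is the one the abstract names: a higher-order Runge-Kutta integrator yields a cheap local error estimate (the difference between the first- and second-order updates), which can drive step-size adaptation to trade off convergence speed against stability.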