Result: Stochastic incremental mirror descent algorithms with Nesterov smoothing

Title:
Stochastic incremental mirror descent algorithms with Nesterov smoothing
Contributors:
Optimisation et commande (OC), Unité de Mathématiques Appliquées (UMA), École Nationale Supérieure de Techniques Avancées (ENSTA Paris), Institut Polytechnique de Paris (IP Paris); Technische Universität Chemnitz (Chemnitz University of Technology)
Publisher Information:
CCSD, 2021.
Publication Year:
2021
Collection:
collection:UMA_ENSTA
collection:TDS-MACS
collection:IP_PARIS
collection:ENSTA-PARIS
collection:ENSTA
collection:DEPARTEMENT-DE-MATHEMATIQUES
collection:IP-PARIS-MATHEMATIQUES
Original Identifier:
HAL: hal-04005441
Document Type:
Conference paper (conferenceObject)
Language:
English
Accession Number:
edshal.hal.04005441v1
Database:
HAL

Further Information

We propose a stochastic incremental mirror descent method constructed by means of Nesterov smoothing for minimizing a sum of finitely many proper, convex and lower semicontinuous functions over a nonempty closed convex set in a Euclidean space. The algorithm can be adapted to minimize, in the same setting, a sum of finitely many proper, convex and lower semicontinuous functions composed with linear operators. Another modification of the scheme leads to a stochastic incremental mirror descent Bregman-proximal scheme with Nesterov smoothing for minimizing, in the same framework, the sum of finitely many proper, convex and lower semicontinuous functions plus a prox-friendly proper, convex and lower semicontinuous function. Unlike previous contributions in the literature on mirror descent methods for minimizing sums of functions, we do not require the summands to be (Lipschitz) continuous or differentiable. Applications in logistics, tomography and machine learning, modelled as optimization problems, illustrate the theoretical achievements.
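To make the idea concrete, here is a minimal illustrative sketch (not the paper's actual algorithm or analysis) of a stochastic incremental mirror descent step with Nesterov smoothing. It minimizes the nonsmooth convex sum of absolute losses sum_i |a_i^T x - b_i| over the probability simplex: each |t| is replaced by its Nesterov smoothing max_{|u|<=1} (u*t - mu*u^2/2) (the Huber function), one summand is sampled per iteration, and the mirror map is the negative entropy, giving exponentiated-gradient updates. All names, step sizes, and the choice of mirror map are assumptions made for this sketch.

```python
import numpy as np

def huber_grad(t, mu):
    # Gradient of the Nesterov smoothing of |t|:
    # f_mu(t) = max_{|u|<=1} (u*t - mu*u**2/2), so f_mu'(t) = clip(t/mu, -1, 1).
    return np.clip(t / mu, -1.0, 1.0)

def smd_nesterov(A, b, steps=2000, mu=0.05, lr=0.05, seed=0):
    """Illustrative stochastic incremental mirror descent sketch for
    min_x sum_i |a_i^T x - b_i| over the probability simplex,
    using entropy as the distance-generating function."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.full(n, 1.0 / n)            # start at the simplex center
    for k in range(steps):
        i = rng.integers(m)            # sample one summand at random
        # smoothed (sub)gradient of |a_i^T x - b_i|
        g = huber_grad(A[i] @ x - b[i], mu) * A[i]
        step = lr / np.sqrt(k + 1)     # diminishing step size
        x = x * np.exp(-step * g)      # mirror step for the entropy Bregman distance
        x /= x.sum()                   # normalize back onto the simplex
    return x
```

A typical use is to recover a simplex-constrained vector from absolute-loss measurements; the entropy mirror map keeps iterates strictly positive and normalized without an explicit projection solver, which is the usual motivation for mirror descent over the simplex.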