Title:
Restart Strategies Enabling Automatic Differentiation for Hyperparameter Tuning in Inverse Problems
Contributors:
Davy, Leo
Source:
2024 32nd European Signal Processing Conference (EUSIPCO). :1811-1815
Publisher Information:
IEEE, 2024.
Publication Year:
2024
Document Type:
Journal Article; Conference object
File Description:
application/pdf
DOI:
10.23919/eusipco63174.2024.10715077
Rights:
STM Policy #29
CC BY NC
Accession Number:
edsair.doi.dedup.....3eb49ef60f0f406baaba532f9898cda4
Database:
OpenAIRE

Further Information

Numerous signal/image processing tasks can be formulated as variational problems, whose solutions depend, often crucially, on the values of hyperparameters. Their automated selection usually involves the computation of gradients of a well-chosen loss function, which is often infeasible analytically. The deep-learning-inspired use of automatic differentiation to compute such gradients, though appealing, is significantly impaired by the usually large number of iterations inherently attached to functional minimization in variational problems. The present work proposes and assesses the use of a restart strategy for automated hyperparameter tuning, combining the benefits of automatic differentiation with properties of proximal iterative algorithms. It theoretically studies the conditions of applicability in a generic algorithmic framework and its specification to accelerated Chambolle-Pock iterations when dealing with a strongly convex objective function. Its effectiveness is illustrated on image denoising and texture segmentation problems.
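
As a rough illustration of the mechanism sketched in the abstract, the following JAX snippet is a minimal sketch, not taken from the paper: it uses plain ISTA on a sparse least-squares problem instead of the accelerated Chambolle-Pock iterations studied by the author, and tunes the regularisation weight by differentiating only through a few iterations run after a long, gradient-free warm start (the "restart" point). All names (ista_step, solve_with_restart, hyper_loss), the supervised squared loss, and the parameter choices are assumptions introduced purely for illustration.

# Hypothetical sketch: hypergradient w.r.t. the regularisation weight lam,
# obtained by restarting the solver and differentiating only the last few iterations.
import jax
import jax.numpy as jnp

def ista_step(x, y, A, lam, step):
    # one proximal-gradient (ISTA) iteration for 0.5*||A x - y||^2 + lam*||x||_1
    grad = A.T @ (A @ x - y)
    z = x - step * grad
    return jnp.sign(z) * jnp.maximum(jnp.abs(z) - step * lam, 0.0)

def solve_with_restart(lam, y, A, x0, step, n_warm=500, n_diff=20):
    # warm start: many iterations kept outside the autodiff tape
    x = jax.lax.stop_gradient(
        jax.lax.fori_loop(0, n_warm, lambda _, x: ista_step(x, y, A, lam, step), x0)
    )
    # restart: automatic differentiation only sees these last n_diff iterations
    for _ in range(n_diff):
        x = ista_step(x, y, A, lam, step)
    return x

def hyper_loss(lam, y, A, x0, x_true, step):
    # supervised loss comparing the reconstruction to a ground truth
    x_hat = solve_with_restart(lam, y, A, x0, step)
    return jnp.sum((x_hat - x_true) ** 2)

# usage: gradient of the loss w.r.t. lam, e.g. inside a gradient-based hyperparameter search
key = jax.random.PRNGKey(0)
A = jax.random.normal(key, (40, 100))
x_true = jnp.zeros(100).at[:5].set(1.0)
y = A @ x_true
step = 1.0 / jnp.linalg.norm(A, 2) ** 2
grad_lam = jax.grad(hyper_loss)(0.1, y, A, jnp.zeros(100), x_true, step)

The point of the restart is memory and time: reverse-mode differentiation only needs to store the last few iterates rather than the full optimisation trajectory, while the warm start brings the iterates close enough to the solution for the truncated hypergradient to be a useful approximation under the conditions analysed in the paper.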