Acceleration by stepsize hedging: Silver Stepsize Schedule for smooth convex optimization
Abstract
We provide a concise, self-contained proof that the Silver Stepsize Schedule proposed in our companion paper directly applies to smooth (non-strongly) convex optimization. Specifically, we show that with these stepsizes, gradient descent computes an $\varepsilon$-minimizer in $O(\varepsilon^{-\log_\rho 2}) = O(\varepsilon^{-0.7864})$ iterations, where $\rho = 1 + \sqrt{2}$ is the silver ratio. This is intermediate between the textbook unaccelerated rate $O(\varepsilon^{-1})$ and the accelerated rate $O(\varepsilon^{-1/2})$ due to Nesterov in 1983. The Silver Stepsize Schedule is a simple explicit fractal: the $i$-th stepsize is $1 + \rho^{\nu(i)-1}$, where $\nu(i)$ is the 2-adic valuation of $i$. The design and analysis are conceptually identical to the strongly convex setting in our companion paper, but simplify remarkably in this specific setting.
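For concreteness, here is a minimal Python sketch of the schedule as defined above, applied to gradient descent. The abstract specifies only the unitless stepsizes $1 + \rho^{\nu(i)-1}$; the scaling of each step by $1/L$ for an $L$-smooth objective, the restriction to horizons of the form $2^k - 1$, and the helper names (`nu`, `silver_stepsize`, `gradient_descent`) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

RHO = 1 + np.sqrt(2)  # the silver ratio

def nu(i: int) -> int:
    """2-adic valuation of i: exponent of the largest power of 2 dividing i."""
    v = 0
    while i % 2 == 0:
        i //= 2
        v += 1
    return v

def silver_stepsize(i: int) -> float:
    """i-th silver stepsize, 1 + rho**(nu(i) - 1), for i >= 1 (unitless)."""
    return 1.0 + RHO ** (nu(i) - 1)

def gradient_descent(grad, x0, L, n_steps):
    """Gradient descent with silver stepsizes, scaled by 1/L
    (assumed normalization for an L-smooth objective)."""
    x = np.asarray(x0, dtype=float)
    for i in range(1, n_steps + 1):
        x = x - (silver_stepsize(i) / L) * grad(x)
    return x

# Example: minimize the L-smooth convex quadratic f(x) = 0.5 * x @ A @ x.
# The schedule is naturally defined over horizons n = 2**k - 1; here k = 6.
A = np.diag([1.0, 10.0, 100.0])
L = 100.0  # smoothness constant = largest eigenvalue of A
x = gradient_descent(lambda x: A @ x, x0=np.ones(3), L=L, n_steps=63)
```

Note how the fractal pattern emerges: the first few stepsizes are $\sqrt{2},\, 2,\, \sqrt{2},\, 2+\sqrt{2},\, \sqrt{2},\, 2,\, \sqrt{2},\, \dots$, with larger steps of size $1 + \rho^{k-1}$ occurring at indices divisible by $2^k$.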