22-24 September 2021
Alpen-Adria-Universität Klagenfurt
Europe/Vienna timezone

Beating the Saturation Phenomenon of Stochastic Gradient Descent

24 Sep 2021, 13:00
HS 1 (Alpen-Adria-Universität Klagenfurt)

Zehui Zhou (Department of Mathematics, The Chinese University of Hong Kong)


Stochastic gradient descent (SGD) is a promising method for solving large-scale inverse problems due to its excellent scalability with respect to data size. However, the current mathematical theory, viewed through the lens of regularization theory, predicts that SGD with a polynomially decaying stepsize schedule may suffer from an undesirable saturation phenomenon: the convergence rate stops improving once the regularity index of the solution exceeds a certain range. In this talk, I will present our recent results on beating this saturation phenomenon:
(i) (By using a small initial stepsize.) We derive a refined convergence rate analysis of SGD, which shows that saturation does not in fact occur if the initial stepsize of the schedule is sufficiently small.
(ii) (By using stochastic variance reduced gradient (SVRG), a popular variance reduction technique for SGD.) We prove that, for a suitable constant stepsize schedule, SVRG achieves a convergence rate that is optimal in terms of the noise level (under a suitable regularity condition), meaning that saturation does not occur.
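As a minimal illustration of the two schedules contrasted in the abstract (a sketch only, not the speakers' analyzed algorithms or rates), the snippet below applies SGD with a polynomially decaying stepsize and SVRG with a constant stepsize to a toy linear inverse problem Ax = y, sampling one row of A per stochastic step. All names and parameter choices here are hypothetical.

```python
import numpy as np

def sgd_decaying(A, y, n_iter=5000, eta0=1.0, alpha=0.5, seed=0):
    """SGD for the least-squares problem min ||Ax - y||^2 with a
    polynomially decaying stepsize eta0 / k**alpha."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for k in range(1, n_iter + 1):
        i = rng.integers(n)                      # sample one equation
        eta = eta0 / k**alpha                    # polynomially decaying stepsize
        x -= eta * (A[i] @ x - y[i]) * A[i]      # stochastic gradient step
    return x

def svrg(A, y, n_epochs=50, eta=0.1, seed=0):
    """SVRG with a constant stepsize: each epoch recomputes the full
    gradient at an anchor point to reduce the variance of inner updates."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(n_epochs):
        x_anchor = x.copy()
        full_grad = A.T @ (A @ x_anchor - y) / n      # full gradient at anchor
        for _ in range(n):
            i = rng.integers(n)
            g = (A[i] @ x - y[i]) * A[i]              # gradient at current iterate
            g_anchor = (A[i] @ x_anchor - y[i]) * A[i]
            x -= eta * (g - g_anchor + full_grad)     # variance-reduced step
    return x

# Toy consistent system with normalized rows (keeps both methods stable).
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5))
A /= np.linalg.norm(A, axis=1, keepdims=True)
x_true = rng.standard_normal(5)
y = A @ x_true

x_sgd = sgd_decaying(A, y)
x_svrg = svrg(A, y)
```

On noise-free data both iterates approach x_true; the regularization-theoretic distinction in the talk concerns how the attainable rate depends on the noise level and the solution's regularity index, which this sketch does not capture.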

Primary authors

Bangti Jin (Department of Computer Science, University College London)
Zehui Zhou (Department of Mathematics, The Chinese University of Hong Kong)
Jun Zou (Department of Mathematics, The Chinese University of Hong Kong)

Presentation Materials

There are no materials yet.