A Continuized View on Nesterov Acceleration for Stochastic Gradient Descent and Randomized Gossip. (arXiv:2106.07644v1 [math.OC])

We introduce the continuized Nesterov acceleration, a close variant of
Nesterov acceleration whose variables are indexed by a continuous time
parameter. The two variables continuously mix following a linear ordinary
differential equation and take gradient steps at random times. This continuized
variant benefits from the best of the continuous and the discrete frameworks:
as a continuous process, one can use differential calculus to analyze
convergence and obtain analytical expressions for the parameters; and a
discretization of the continuized process can be computed exactly with
convergence rates similar to those of Nesterov's original acceleration. We show
that the discretization has the same structure as Nesterov acceleration, but
with random parameters. We provide continuized Nesterov acceleration under
deterministic as well as stochastic gradients, with either additive or
multiplicative noise. Finally, using our continuized framework and expressing
the gossip averaging problem as the stochastic minimization of a certain energy
function, we provide the first rigorous acceleration of asynchronous gossip
algorithms.
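
To make the mechanism concrete, here is a minimal simulation sketch of the continuized dynamics described above, for a mu-strongly convex, L-smooth objective: between the jumps of a rate-1 Poisson process, the pair (x, z) mixes through a linear ODE that can be integrated in closed form, and at each random jump time both variables take a gradient step. The mixing rate `eta` and step sizes `gamma`, `gamma_tilde` below are illustrative assumptions for this sketch, not the paper's exact tuning.

```python
import numpy as np

def continuized_nesterov(grad, x0, mu, L, T=50.0, rng=None):
    """Minimal sketch of a continuized-Nesterov simulation.

    Between jumps of a rate-1 Poisson process, (x, z) mix through the
    linear ODE dx = eta*(z - x) dt, dz = eta*(x - z) dt, integrated in
    closed form. At each jump time, both variables take a gradient
    step evaluated at the current point x.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Illustrative parameter choices for a mu-strongly convex, L-smooth f
    # (assumptions for this sketch, not the paper's exact values).
    eta = np.sqrt(mu / L)                # mixing rate of the linear ODE
    gamma = 1.0 / L                      # gradient step size on x
    gamma_tilde = 1.0 / np.sqrt(mu * L)  # gradient step size on z

    x = np.array(x0, dtype=float)
    z = x.copy()
    t = 0.0
    while t < T:
        dt = rng.exponential(1.0)  # Poisson(1) inter-arrival time
        t += dt
        # Exact mixing over the interval: the mean of (x, z) is
        # preserved while their difference decays exponentially.
        mean, diff = (x + z) / 2.0, (x - z) / 2.0
        diff *= np.exp(-2.0 * eta * dt)
        x, z = mean + diff, mean - diff
        # Gradient jump at the random time.
        g = grad(x)
        x = x - gamma * g
        z = z - gamma_tilde * g
    return x
```

For instance, on a quadratic f(x) = x^T A x / 2 with A = diag(1, 10) (so mu = 1, L = 10), one could call `continuized_nesterov(lambda x: A @ x, x0=[5.0, -3.0], mu=1.0, L=10.0)`. Because the inter-jump mixing is a linear ODE with a closed-form solution, the simulation is exact in the sense the abstract describes: no ODE solver or time discretization error is involved, only the random jump times.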

Source: https://arxiv.org/abs/2106.07644