On acceleration with noise-corrupted gradients
Date
2018
Authors
Orecchia, Lorenzo
Diakonikolas, Jelena
Cohen, Michael B.
Version
Published version
Citation
Lorenzo Orecchia, Jelena Diakonikolas, and Michael Cohen. 2018. "On Acceleration with Noise-Corrupted Gradients." In Proceedings of the 35th International Conference on Machine Learning (ICML 2018).
Abstract
Accelerated algorithms have broad applications in large-scale optimization, due to their generality and fast convergence. However, their stability in the practical setting of noise-corrupted gradient oracles is not well-understood. This paper provides two main technical contributions: (i) a new accelerated method AGD+ that generalizes Nesterov's AGD and improves on the recent method AXGD (Diakonikolas & Orecchia, 2018), and (ii) a theoretical study of accelerated algorithms under noisy and inexact gradient oracles, which is supported by numerical experiments. This study leverages the simplicity of AGD+ and its analysis to clarify the interaction between noise and acceleration and to suggest modifications to the algorithm that reduce the mean and variance of the error incurred due to the gradient noise.
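The setting the abstract describes, an accelerated first-order method queried through a noise-corrupted gradient oracle, can be illustrated with a minimal sketch. The snippet below is not the paper's AGD+ method; it runs a standard Nesterov-style AGD on a toy quadratic whose gradient oracle returns the exact gradient plus additive Gaussian noise. The quadratic `A`, smoothness constant `L`, noise level `sigma`, and iteration count are illustrative assumptions chosen here, not values from the paper.

```python
import numpy as np

# A minimal sketch (not the paper's AGD+): Nesterov-style AGD on a toy
# quadratic f(x) = 0.5 * x^T A x, accessed only through a gradient oracle
# corrupted by additive Gaussian noise of standard deviation `sigma`.
# All problem parameters below are illustrative assumptions.

rng = np.random.default_rng(0)
n = 50
A = np.diag(np.linspace(1e-2, 1.0, n))  # ill-conditioned quadratic
L = 1.0                                  # smoothness constant (largest eigenvalue of A)

def noisy_grad(x, sigma):
    """Exact gradient A @ x plus i.i.d. Gaussian noise."""
    return A @ x + sigma * rng.standard_normal(n)

def agd(sigma, iters=500):
    x = y = np.ones(n)
    x_prev = x.copy()
    for k in range(iters):
        g = noisy_grad(y, sigma)
        x_prev, x = x, y - g / L                 # gradient step from the extrapolated point
        y = x + (k / (k + 3.0)) * (x - x_prev)   # Nesterov momentum
    return 0.5 * x @ (A @ x)                     # optimum is f(0) = 0

for sigma in (0.0, 1e-3, 1e-2):
    print(f"sigma={sigma:g}: final objective ~ {agd(sigma):.3e}")
```

With `sigma = 0` the objective decays rapidly, while larger noise levels leave a persistent error floor; this accumulation of gradient noise under acceleration is the phenomenon the paper studies and that its proposed modifications aim to mitigate.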
License
Copyright 2018 by the author(s).