Alternating randomized block coordinate descent
Date
2018
Authors
Orecchia, Lorenzo
Diakonikolas, Jelena
Version
Published version
Citation
Lorenzo Orecchia, Jelena Diakonikolas. 2018. "Alternating Randomized Block Coordinate Descent." Proceedings of the 35th International Conference on Machine Learning (ICML 2018). Available at: https://arxiv.org/abs/1805.09185
Abstract
Block-coordinate descent algorithms and alternating minimization methods are fundamental optimization algorithms and an important primitive in large-scale optimization and machine learning. While various block-coordinate-descent-type methods have been studied extensively, only alternating minimization -- which applies to the setting of only two blocks -- is known to have convergence time that scales independently of the least smooth block. A natural question is then: is the setting of two blocks special?
We show that the answer is "no" as long as the least smooth block can be optimized exactly -- an assumption that is also needed in the setting of alternating minimization. We do so by introducing a novel algorithm AR-BCD, whose convergence time scales independently of the least smooth (possibly non-smooth) block. The basic algorithm generalizes both alternating minimization and randomized block coordinate (gradient) descent, and we also provide its accelerated version -- AAR-BCD. As a special case of AAR-BCD, we obtain the first nontrivial accelerated alternating minimization algorithm.
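To make the scheme from the abstract concrete, here is a minimal toy sketch of the alternating idea on a convex quadratic f(x, y) = 0.5 ||A x + B y - b||^2: the smooth variable x is split into blocks updated by randomized block coordinate gradient steps, while y (standing in for the least smooth block) is re-optimized exactly at every iteration via a least-squares solve. This is an illustrative sketch under those assumptions, not the paper's AR-BCD algorithm; the function `ar_bcd_sketch` and its parameters are hypothetical names chosen here.

```python
import numpy as np

def ar_bcd_sketch(A, B, b, n_blocks=2, iters=2000, seed=0):
    """Toy alternating scheme for f(x, y) = 0.5 * ||A x + B y - b||^2.

    Illustrative sketch only (not the paper's AR-BCD): x-blocks take
    randomized block coordinate gradient steps; the y block is
    minimized exactly at each iteration, so its smoothness never
    enters the step sizes.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(A.shape[1])
    blocks = np.array_split(np.arange(A.shape[1]), n_blocks)
    # Block Lipschitz constants of the gradient in x: L_i = ||A_i||_2^2.
    Ls = [np.linalg.norm(A[:, blk], 2) ** 2 for blk in blocks]
    for _ in range(iters):
        # Exact minimization over y given the current x (least squares).
        y = np.linalg.lstsq(B, b - A @ x, rcond=None)[0]
        # Randomized block coordinate gradient step on x with step 1/L_i.
        i = rng.integers(n_blocks)
        blk = blocks[i]
        g = A[:, blk].T @ (A @ x + B @ y - b)
        x[blk] -= g / Ls[i]
    y = np.linalg.lstsq(B, b - A @ x, rcond=None)[0]
    return x, y
```

On a consistent system (b in the range of [A B]) the residual ||A x + B y - b|| is driven toward zero, which is the point of the construction: the step sizes depend only on the smooth blocks of x, never on y.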
License
Copyright 2018 by the author(s).