Lower Bounds for Parallel and Randomized Convex Optimization

Jelena Diakonikolas, Cristóbal Andrés Guzmán Paredes

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

11 Citations (Scopus)
75 Downloads (Pure)

Abstract

We study the question of whether parallelization in the exploration of the feasible set can be used to speed up convex optimization, in the local oracle model of computation. We show that the answer is negative for both deterministic and randomized algorithms applied to essentially any of the interesting geometries and nonsmooth, weakly-smooth, or smooth objective functions. In particular, we show that it is not possible to obtain a polylogarithmic (in the sequential complexity of the problem) number of parallel rounds with a polynomial (in the dimension) number of queries per round. In the majority of these settings and when the dimension of the space is polynomial in the inverse target accuracy, our lower bounds match the oracle complexity of sequential convex optimization, up to at most a logarithmic factor in the dimension, which makes them (nearly) tight. Prior to our work, lower bounds for parallel convex optimization algorithms were only known in a small fraction of the settings considered in this paper, mainly applying to Euclidean (ℓ2) and ℓ∞ spaces. Our work provides a more general and streamlined approach for proving lower bounds in the setting of parallel convex optimization.
Original language: English
Title of host publication: Proceedings of the Thirty-Second Conference on Learning Theory
Publication status: Published - 2019
Externally published: Yes
Event: 32nd Conference on Learning Theory, COLT 2019 - Phoenix, United States
Duration: 25 Jun 2019 – 28 Jun 2019
Conference number: 32

Publication series

Name: Proceedings of Machine Learning Research
Volume: 99

Conference

Conference: 32nd Conference on Learning Theory, COLT 2019
Abbreviated title: COLT 2019
Country/Territory: United States
City: Phoenix
Period: 25/06/19 – 28/06/19

