Abstract
We derive lower bounds on the black-box oracle complexity of large-scale smooth convex minimization problems, with emphasis on minimizing smooth convex functions (those whose gradient is Hölder continuous with a given exponent and constant) over high-dimensional ℓ_p-balls. Our bounds turn out to be tight (up to factors logarithmic in the design dimension) and can be viewed as a substantial extension of the existing lower complexity bounds for large-scale convex minimization, which cover the nonsmooth case and the "Euclidean" smooth case (minimization of convex functions with Lipschitz continuous gradients over Euclidean balls). As a byproduct of our results, we demonstrate that the classical Conditional Gradient algorithm is near-optimal, in the sense of Information-Based Complexity Theory, when minimizing smooth convex functions over high-dimensional ℓ_∞-balls and their matrix analogues, spectral norm balls in the spaces of square matrices.
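For context, the Conditional Gradient (Frank-Wolfe) method mentioned above is particularly simple over an ℓ_∞-ball (a box), because its linear minimization oracle has a closed form. The sketch below is purely illustrative and not taken from the paper; the quadratic objective, the radius `R`, and the standard step size 2/(k+2) are generic textbook choices, not assumptions about the authors' setting.

```python
import numpy as np

def conditional_gradient_box(grad, x0, R, num_iters=200):
    """Illustrative Frank-Wolfe / Conditional Gradient over the l_inf ball of radius R.

    grad: callable returning the gradient of the smooth convex objective.
    The linear minimization oracle over the box [-R, R]^n has the closed form
    argmin_{s in box} <g, s> with s_i = -R * sign(g_i).
    """
    x = np.asarray(x0, dtype=float)
    for k in range(num_iters):
        g = grad(x)
        s = -R * np.sign(g)          # linear minimization oracle on the box
        gamma = 2.0 / (k + 2.0)      # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * s
    return x

# Hypothetical usage: minimize the smooth convex quadratic 0.5 * ||A x - b||^2 over the box.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
x_sol = conditional_gradient_box(lambda x: A.T @ (A @ x - b), np.zeros(20), R=1.0)
```

Each iteration needs only one gradient evaluation and one linear minimization over the feasible set, which is what makes the method attractive in the high-dimensional regime the paper studies.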
| Original language | English |
| --- | --- |
| Pages (from-to) | 1-14 |
| Journal | Journal of Complexity |
| Volume | 31 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - Feb 2015 |
| Externally published | Yes |