Asymptotic linear convergence of fully-corrective generalized conditional gradient methods

Kristian Bredies, Marcello Carioni, Silvio Fanzon, Daniel Walter*

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review


Abstract

We propose a fully-corrective generalized conditional gradient method (FC-GCG) for the minimization of the sum of a smooth, convex loss function and a convex one-homogeneous regularizer over a Banach space. The algorithm relies on the mutual update of a finite set A_k of extremal points of the unit ball of the regularizer and of an iterate u_k ∈ cone(A_k). Each iteration requires the solution of one linear problem to update A_k and of one finite-dimensional convex minimization problem to update the iterate. Under standard hypotheses on the minimization problem, we show that the algorithm converges sublinearly to a solution. Subsequently, imposing additional assumptions on the associated dual variables, this is improved to a linear rate of convergence. The proof of both results relies on two key observations: First, we prove the equivalence of the considered problem to the minimization of a lifted functional over a particular space of Radon measures using Choquet's theorem. Second, the FC-GCG algorithm is connected to a Primal-Dual-Active-Point method on the lifted problem, for which we finally derive the desired convergence rates.
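The abstract describes the FC-GCG iteration only at a high level. As a concrete illustration, below is a minimal Python sketch of the three steps per iteration (linear problem / extremal-point oracle, fully-corrective finite-dimensional subproblem, pruning) for the simplest finite-dimensional instance, ℓ¹-regularized least squares min_u ½‖Ku − b‖² + α‖u‖₁, where the extremal points of the regularizer's unit ball are the signed coordinate vectors ±e_i. This is not the paper's general Banach-space setting: all function names here are hypothetical, and the corrective subproblem is handed to an off-the-shelf bound-constrained solver rather than treated as in the paper's analysis.

```python
import numpy as np
from scipy.optimize import minimize

def fc_gcg_l1(K, b, alpha, max_iter=50, tol=1e-10):
    """Illustrative FC-GCG sketch for min_u 0.5*||K u - b||^2 + alpha*||u||_1.

    Extremal points of the l1 unit ball are the signed unit vectors +/- e_i,
    so the active set A_k can be stored as a list of (index, sign) pairs.
    Hypothetical sketch only; the paper works in general Banach spaces.
    """
    m, n = K.shape
    active = []            # A_k: extremal points currently in the active set
    u = np.zeros(n)        # u_k in cone(A_k)

    for _ in range(max_iter):
        grad = K.T @ (K @ u - b)          # gradient of the smooth loss

        # 1) Linear problem: extremal point most correlated with -grad.
        i = int(np.argmax(np.abs(grad)))
        s = -np.sign(grad[i])
        # Crude stopping test: ||grad||_inf <= alpha at a solution.
        if np.abs(grad[i]) <= alpha * (1 + tol):
            break
        if (i, s) not in active:
            active.append((i, s))

        # 2) Fully-corrective step: finite-dimensional convex problem over
        #    nonnegative weights c on the current extremal points.
        V = np.zeros((n, len(active)))
        for j, (idx, sgn) in enumerate(active):
            V[idx, j] = sgn

        def obj(c):
            r = K @ (V @ c) - b
            return 0.5 * r @ r + alpha * c.sum()

        def obj_grad(c):
            return V.T @ (K.T @ (K @ (V @ c) - b)) + alpha

        c0 = np.maximum(V.T @ u, 0.0)     # warm start from current weights
        res = minimize(obj, c0, jac=obj_grad, method="L-BFGS-B",
                       bounds=[(0.0, None)] * len(active))
        c = res.x

        # 3) Prune extremal points whose weight dropped to (near) zero.
        keep = c > 1e-12
        active = [a for a, k in zip(active, keep) if k]
        u = V[:, keep] @ c[keep]

    return u, active

# Hypothetical usage on synthetic data:
rng = np.random.default_rng(0)
K = rng.standard_normal((40, 100))
u_true = np.zeros(100); u_true[[3, 17]] = [2.0, -1.5]
u_hat, A = fc_gcg_l1(K, K @ u_true, alpha=0.1)
```

Note that the fully-corrective step minimizes the lifted objective f(Σ_j c_j v_j) + α Σ_j c_j over nonnegative weights rather than evaluating the regularizer at the combination itself; this mirrors the measure-space lifting in the paper's analysis, where the weight vector c plays the role of a finitely supported Radon measure on the set of extremal points.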

Original language: English
Pages (from-to): 135-202
Number of pages: 68
Journal: Mathematical Programming
Volume: 205
Issue number: 1-2
Early online date: 13 Jul 2023
Publication status: Published - May 2024

Keywords

  • Choquet’s theorem
  • Conditional gradient method
  • Non-smooth optimization
  • Sparsity
