TY - JOUR
T1 - Asymptotic linear convergence of fully-corrective generalized conditional gradient methods
AU - Bredies, Kristian
AU - Carioni, Marcello
AU - Fanzon, Silvio
AU - Walter, Daniel
N1 - Funding Information:
KB and SF gratefully acknowledge support by the Christian Doppler Research Association (CDG) and Austrian Science Fund (FWF) through the Partnership in Research project PIR-27 “Mathematical methods for motion-aware medical imaging” and project P 29192 “Regularization graphs for variational imaging”. MC is supported by the Royal Society (Newton International Fellowship NIF\R1\192048 “Minimal partitions as a robustness boost for neural network classifiers”). The Department of Mathematics and Scientific Computing, to which KB and SF are affiliated, is a member of NAWI Graz (https://www.nawigraz.at/en/). The authors KB and SF are further members of/associated with BioTechMed Graz (https://biotechmedgraz.at/en/).
Publisher Copyright:
© 2023, The Author(s).
PY - 2024/5
Y1 - 2024/5
N2 - We propose a fully-corrective generalized conditional gradient method (FC-GCG) for the minimization of the sum of a smooth, convex loss function and a convex one-homogeneous regularizer over a Banach space. The algorithm relies on the mutual update of a finite set A_k of extremal points of the unit ball of the regularizer and of an iterate u_k ∈ cone(A_k). Each iteration requires the solution of one linear problem to update A_k and of one finite-dimensional convex minimization problem to update the iterate. Under standard hypotheses on the minimization problem, we show that the algorithm converges sublinearly to a solution. Subsequently, imposing additional assumptions on the associated dual variables, this rate is improved to linear convergence. The proof of both results relies on two key observations: First, we prove the equivalence of the considered problem to the minimization of a lifted functional over a particular space of Radon measures using Choquet’s theorem. Second, the FC-GCG algorithm is connected to a Primal-Dual-Active-Point method on the lifted problem, for which we finally derive the desired convergence rates.
AB - We propose a fully-corrective generalized conditional gradient method (FC-GCG) for the minimization of the sum of a smooth, convex loss function and a convex one-homogeneous regularizer over a Banach space. The algorithm relies on the mutual update of a finite set A_k of extremal points of the unit ball of the regularizer and of an iterate u_k ∈ cone(A_k). Each iteration requires the solution of one linear problem to update A_k and of one finite-dimensional convex minimization problem to update the iterate. Under standard hypotheses on the minimization problem, we show that the algorithm converges sublinearly to a solution. Subsequently, imposing additional assumptions on the associated dual variables, this rate is improved to linear convergence. The proof of both results relies on two key observations: First, we prove the equivalence of the considered problem to the minimization of a lifted functional over a particular space of Radon measures using Choquet’s theorem. Second, the FC-GCG algorithm is connected to a Primal-Dual-Active-Point method on the lifted problem, for which we finally derive the desired convergence rates.
KW - Choquet’s theorem
KW - Conditional gradient method
KW - Non-smooth optimization
KW - Sparsity
UR - http://www.scopus.com/inward/record.url?scp=85164776985&partnerID=8YFLogxK
U2 - 10.1007/s10107-023-01975-z
DO - 10.1007/s10107-023-01975-z
M3 - Article
AN - SCOPUS:85164776985
SN - 0025-5610
VL - 205
SP - 135
EP - 202
JO - Mathematical Programming
JF - Mathematical Programming
IS - 1-2
ER -