Asymptotic linear convergence of fully-corrective generalized conditional gradient methods

Type:
Article
Authors:
Bredies, Kristian; Carioni, Marcello; Fanzon, Silvio; Walter, Daniel
Affiliations:
University of Graz; University of Twente; University of Hull; Humboldt University of Berlin
Journal:
MATHEMATICAL PROGRAMMING
ISSN:
0025-5610
DOI:
10.1007/s10107-023-01975-z
Publication date:
2024
Pages:
135-202
Keywords:
initial data identification; simplicial decomposition; sparsity; algorithms; Wolfe; rates
Abstract:
We propose a fully-corrective generalized conditional gradient method (FC-GCG) for the minimization of the sum of a smooth, convex loss function and a convex, one-homogeneous regularizer over a Banach space. The algorithm relies on the mutual update of a finite set A_k of extremal points of the unit ball of the regularizer and of an iterate u_k ∈ cone(A_k). Each iteration requires the solution of one linear problem to update A_k and of one finite-dimensional convex minimization problem to update the iterate. Under standard hypotheses on the minimization problem we show that the algorithm converges sublinearly to a solution. Subsequently, imposing additional assumptions on the associated dual variables, this is improved to a linear rate of convergence. The proof of both results relies on two key observations: First, we prove the equivalence of the considered problem to the minimization of a lifted functional over a particular space of Radon measures using Choquet's theorem. Second, the FC-GCG algorithm is connected to a Primal-Dual-Active-Point method on the lifted problem, for which we finally derive the desired convergence rates.
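To illustrate the two-step structure described in the abstract, here is a minimal, hedged sketch of a fully-corrective conditional gradient iteration in a simple finite-dimensional special case: a quadratic loss 0.5*||A u - b||^2 plus the l1-norm regularizer, whose unit ball has the signed unit vectors ±e_i as extremal points. All names (`fc_gcg_l1`, the projected-gradient inner solver, the tolerances) are illustrative choices, not the paper's actual algorithmic details; the infinite-dimensional setting, the lifting to Radon measures, and the convergence analysis are of course not captured here.

```python
import numpy as np

def fc_gcg_l1(A, b, alpha, max_iter=50, inner_iter=200):
    """Sketch of a fully-corrective conditional gradient iteration for
    min_u 0.5*||A u - b||^2 + alpha*||u||_1.

    The active set collects selected extremal points (atoms) of the l1
    unit ball, and the iterate lives in their conic hull, mirroring the
    update of A_k and u_k in cone(A_k) described in the abstract."""
    n = A.shape[1]
    atoms = []                    # active set A_k of atoms
    u = np.zeros(n)
    for _ in range(max_iter):
        g = A.T @ (A @ u - b)     # gradient of the smooth loss
        i = int(np.argmax(np.abs(g)))
        if np.abs(g[i]) <= alpha + 1e-12:
            break                 # stationarity: no atom improves the iterate
        # linear problem: best extremal point +-e_i of the l1 ball
        atom = np.zeros(n)
        atom[i] = -np.sign(g[i])
        atoms.append(atom)
        # finite-dimensional convex subproblem over cone(A_k):
        #   min_{c >= 0} 0.5*||(A U) c - b||^2 + alpha * sum(c)
        # solved here by simple projected gradient descent
        U = np.column_stack(atoms)
        M = A @ U
        L = np.linalg.norm(M.T @ M, 2) + 1e-12   # step-size bound
        c = np.zeros(U.shape[1])
        for _ in range(inner_iter):
            grad_c = M.T @ (M @ c - b) + alpha
            c = np.maximum(0.0, c - grad_c / L)
        u = U @ c
        # fully corrective: drop atoms whose coefficient vanished
        atoms = [a for a, cj in zip(atoms, c) if cj > 1e-12]
    return u
```

For `A` the identity, the solution is the soft-thresholding of `b`, which makes the sketch easy to sanity-check; in each outer iteration the inner solve re-optimizes over all currently active atoms, which is what distinguishes the fully-corrective variant from a plain conditional gradient step.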