In this talk, we propose and analyze a generalized conditional gradient method for infinite-dimensional variational inverse problems written as the sum of a smooth, convex loss function and a possibly non-smooth, convex regularizer.
Our method relies on the mutual update of a sequence of extremal points of the unit ball of the regularizer and of a sparse iterate given as a suitable linear combination of such extremal points.
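As a rough illustration of this update scheme, consider a toy finite-dimensional analogue (not the infinite-dimensional method of the talk): minimizing a smooth least-squares loss over the unit l1-ball, whose extremal points are the signed coordinate vectors. The problem data (`A`, `y`) and the step-size rule are illustrative choices; the sketch shows the interplay between selecting a new extremal point via a linear minimization oracle and maintaining the iterate as a sparse linear combination of the points selected so far.

```python
import numpy as np

# Hypothetical toy instance: minimize f(x) = 0.5 * ||A x - y||^2 over the
# unit l1-ball.  The extremal points of this ball are +/- e_i, mirroring
# (in finite dimensions) the role of extremal points of the regularizer's
# unit ball in the generalized conditional gradient method.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[[3, 17]] = [0.6, -0.4]          # sparse ground truth inside the ball
y = A @ x_true

x = np.zeros(50)                       # sparse iterate
active = {}                            # selected extremal points -> weights
for k in range(200):
    grad = A.T @ (A @ x - y)
    i = int(np.argmax(np.abs(grad)))   # linear minimization oracle over
    s = -np.sign(grad[i])              # the extremal points s * e_i
    gamma = 2.0 / (k + 2)              # standard conditional gradient step
    x *= 1.0 - gamma                   # mutual update: shrink old weights...
    x[i] += gamma * s                  # ...and mix in the new extremal point
    for j in active:
        active[j] *= 1.0 - gamma
    active[i] = active.get(i, 0.0) + gamma * s

print(sorted(j for j, c in active.items() if abs(c) > 1e-3))
```

In the full method, a corrective step would re-optimize the weights over the current set of extremal points at each iteration; the plain step-size rule above is the simplest variant.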
We show that, under standard hypotheses on the minimization problem, our algorithm converges sublinearly to a solution of the inverse problem. Moreover, we demonstrate that, by imposing additional assumptions on the structure of the minimizers, on the associated dual variables, and on the nondegeneracy of the problem, we can improve this convergence result to a linear rate.
We then apply our generalized conditional gradient method to dynamic inverse problems regularized with the Benamou-Brenier energy. Relying on recent results on the characterization of the extremal points of the unit ball of the Benamou-Brenier energy, we show that our algorithm applies to this specific example, allowing us to reconstruct the motion of heavily undersampled dynamic data in the presence of noise.