Mathurin Massias. Fast resolution of structured inverse problems: extrapolation and iterative regularization

This Thursday 14 January at 4 pm we will welcome Mathurin Massias, post-doctoral researcher at the University of Genova, who will present his work on structured inverse problems:

Title: Fast resolution of structured inverse problems: extrapolation and iterative regularization

Abstract: Overparametrization is common in linear inverse problems, which raises the question of stability and uniqueness of the solution. A remedy is to select a specific solution by minimizing a bias functional over all interpolating solutions. This functional is frequently neither smooth nor convex (e.g. L1, L2/L1, nuclear norm, TV). In the first part of the talk, we study fast solvers for the so-called Tikhonov approach, where the bilevel optimization problem is relaxed into a “datafit + regularization” formulation (e.g., the Lasso). We show that, for the separable problems arising in machine learning, coordinate descent algorithms can be accelerated by Anderson extrapolation, outperforming full-gradient methods and inertial acceleration.
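For readers who want a concrete picture of the first part, here is a minimal NumPy sketch of Anderson extrapolation wrapped around cyclic coordinate descent for the Lasso. It follows the usual Anderson recipe (extrapolate from the last K iterates, keep the extrapolated point only if it decreases the objective); the function names and parameter choices are ours for illustration, not the implementation from the paper.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.)

def lasso_obj(X, y, w, alpha):
    return 0.5 * np.linalg.norm(y - X @ w) ** 2 + alpha * np.abs(w).sum()

def cd_epoch(w, r, X, alpha, lipschitz):
    """One pass of cyclic coordinate descent; r = y - X @ w is kept up to date."""
    for j in range(X.shape[1]):
        if lipschitz[j] == 0.:
            continue
        old = w[j]
        w[j] = soft_threshold(old + X[:, j] @ r / lipschitz[j], alpha / lipschitz[j])
        if w[j] != old:
            r -= (w[j] - old) * X[:, j]
    return w, r

def anderson_cd_lasso(X, y, alpha, n_epochs=100, K=5):
    n_features = X.shape[1]
    lipschitz = (X ** 2).sum(axis=0)          # coordinate-wise Lipschitz constants
    w = np.zeros(n_features)
    r = y.copy()
    past = []                                 # store the last K + 1 iterates
    for _ in range(n_epochs):
        w, r = cd_epoch(w, r, X, alpha, lipschitz)
        past.append(w.copy())
        if len(past) == K + 1:
            U = np.diff(np.array(past), axis=0).T   # differences between iterates
            try:
                # coefficients c summing to 1 that minimize ||U @ c||
                c = np.linalg.solve(U.T @ U, np.ones(K))
                c /= c.sum()
                w_acc = np.array(past[1:]).T @ c
                # accept the extrapolated point only if it improves the objective
                if lasso_obj(X, y, w_acc, alpha) < lasso_obj(X, y, w, alpha):
                    w = w_acc
                    r = y - X @ w
            except np.linalg.LinAlgError:
                pass                          # singular small system: skip this round
            past = []
    return w
```

On a small random problem (e.g. `X = np.random.randn(100, 500)`), comparing the objective curves with and without the extrapolation step gives a feel for the kind of speed-up discussed in the talk.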
The Tikhonov approach can be costly, as it requires calibrating the regularization strength over a grid of values, and thus solving many optimization problems. In the second part of the talk, we present results on iterative regularization: a single optimization problem is solved, and the number of iterations acts as the regularizer. We derive novel results on the early-stopped iterates in the case where the bias is convex but not strongly convex.
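As a complement, here is a toy illustration of the iterative-regularization idea in its simplest classical instance (gradient descent on the data fit started from zero, i.e. a squared L2 bias), rather than the convex-bias setting analysed in the papers: the reconstruction error first decreases and then degrades as the iterates start fitting the noise, so the stopping iteration plays the role of the regularization parameter. Problem sizes and noise level are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 200                                   # overparametrized linear model
X = rng.standard_normal((n, p))
w_true = np.zeros(p)
w_true[:5] = 1.
y = X @ w_true + 0.5 * rng.standard_normal(n)    # noisy measurements

step = 1. / np.linalg.norm(X, ord=2) ** 2        # 1 / ||X||_op^2
w = np.zeros(p)
errors = []
for _ in range(2000):
    w -= step * X.T @ (X @ w - y)                # gradient step on the data fit only
    errors.append(np.linalg.norm(w - w_true))

best_it = int(np.argmin(errors))
print(f"best error {errors[best_it]:.3f} at iteration {best_it}, "
      f"final error {errors[-1]:.3f}")           # semiconvergence: stopping early helps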

The presentation is based on:

– https://arxiv.org/abs/2011.10065 Anderson acceleration of coordinate descent
– https://arxiv.org/abs/2006.09859 Iterative regularization for convex regularizers
– https://arxiv.org/abs/1907.05830 Dual extrapolation for sparse generalized linear models

(joint work with Alexandre Gramfort, Joseph Salmon, Samuel Vaiter, Quentin Bertrand, Cesare Molinari, Lorenzo Rosasco and Silvia Villa)