Three papers will be presented this year at NeurIPS in New Orleans.
- Abide by the Law and Follow the Flow: Conservation Laws for Gradient Flows with Sibylle Marcotte, Rémi Gribonval and Gabriel Peyré. Oral presentation.
The purpose of this article is to present the definition and basic properties of “conservation laws”, which are maximal sets of independent quantities conserved during the gradient flow of a given model, and to explain how to determine the exact number of such quantities by performing finite-dimensional algebraic manipulations on the Lie algebra generated by the Jacobian of the model.
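As a toy illustration of such a conserved quantity (a classic special case, not the paper's general Lie-algebra machinery), consider the two-parameter linear model f(u, v) = u·v trained with a squared loss: the quantity u² − v² is invariant along the gradient flow, and stays nearly constant under small-step gradient descent. The model, loss, and step size below are illustrative choices, not taken from the paper.

```python
# Toy conservation law for f(u, v) = u * v with loss L = (u*v - y)^2:
# d/dt (u^2 - v^2) = -2u (dL/du) + 2v (dL/dv)
#                  = -4e*u*v + 4e*u*v = 0   (with e = u*v - y),
# so u^2 - v^2 is conserved along the gradient flow.
def grads(u, v, y=1.0):
    e = u * v - y
    return 2 * e * v, 2 * e * u  # dL/du, dL/dv

u, v = 1.5, 0.5
c0 = u ** 2 - v ** 2      # conserved quantity at initialization
lr = 1e-4                 # small step to approximate the continuous flow
for _ in range(20_000):
    gu, gv = grads(u, v)
    u, v = u - lr * gu, v - lr * gv

drift = abs((u ** 2 - v ** 2) - c0)
print(f"u*v = {u * v:.4f}, drift of u^2 - v^2 = {drift:.2e}")
```

The loss is driven to zero while u² − v² barely moves; the drift that remains is the O(lr²) discretization error of Euler steps, and it vanishes in the continuous-time limit.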
- Does a sparse ReLU network training problem always admit an optimum? with Quoc-Tung Le, Elisa Riccietti and Rémi Gribonval.
This work shows that optimization problems involving deep networks with certain sparsity patterns do not always have optimal parameters, and that optimization algorithms may then diverge. Via a new topological relation between sparse ReLU neural networks and their linear counterparts, the article derives an algorithm to verify whether a given sparsity pattern suffers from this issue.
- SNEkhorn: Dimension Reduction with Symmetric Entropic Affinities with Hugues Van Assel, Titouan Vayer, Rémi Flamary and Nicolas Courty.
This work uncovers a novel characterization of entropic affinities as an optimal transport problem, allowing a natural symmetrization that can be computed efficiently using dual ascent. The resulting affinity matrix benefits from symmetric doubly stochastic normalization in terms of clustering performance, while also effectively controlling the entropy of each row, making it particularly robust to varying noise levels. Building on this, the paper presents a new dimensionality reduction (DR) algorithm, SNEkhorn, that leverages this new affinity matrix.
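For intuition about what symmetric doubly stochastic normalization produces, here is a minimal sketch using a plain symmetric Sinkhorn fixed-point iteration. This is not SNEkhorn's algorithm, which additionally controls the entropy of each row via dual ascent; the data, bandwidth, and iteration count are illustrative assumptions.

```python
import numpy as np

# Symmetric doubly stochastic normalization of a Gibbs kernel via a
# Sinkhorn-style fixed-point iteration (intuition only -- NOT SNEkhorn's
# symmetric entropic affinity, which also constrains per-row entropies).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))                  # toy data: 50 points in R^3

# Symmetric cost: pairwise squared Euclidean distances.
C = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-C / C.mean())                     # Gibbs kernel (bandwidth = mean cost)

# Because K is symmetric, a single scaling vector u suffices:
# P = diag(u) K diag(u), with the fixed point u_i * (K u)_i = 1.
u = np.ones(len(K))
for _ in range(1000):
    u = np.sqrt(u / (K @ u))                  # damped fixed-point update
P = u[:, None] * K * u[None, :]               # symmetric, rows/columns sum to 1
```

At the fixed point, each row of P sums to u_i·(Ku)_i = 1, and symmetry of K makes P symmetric by construction; the contrast with per-row entropic affinities (which are generally not symmetric) is exactly what motivates the paper's symmetrization.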