About

I am a Ph.D. student at École Normale Supérieure (ENS Paris) and a Student Researcher at Google Brain. I work with Gabriel Peyré and Mathieu Blondel on building and studying new deep learning models.

I graduated from École Polytechnique in 2020 and hold a master's degree in mathematics, vision and learning (MVA) from ENS Paris-Saclay, as well as a master's degree in mathematics (modeling) from Sorbonne Université.

Research interests

My main research interests lie at the intersection of deep learning, dynamical systems, optimal transport and differentiable learning. I am particularly interested in the neural ODE framework as a tool to design and study new deep architectures, such as ResNets and Transformers. I am also interested in the convergence of the hidden-state trajectories of ResNets to the continuous trajectories of neural ODEs.
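As a toy illustration of this last point (a hand-picked example, not taken from any of the papers below), a residual network whose blocks are scaled by 1/depth can be read as a forward Euler discretization of a neural ODE, and its final hidden state approaches the ODE solution as depth grows. Here the vector field f(x) = -x is chosen purely for its known exact solution:

```python
import numpy as np

def f(x):
    # Illustrative residual function; chosen so the ODE dx/dt = -x
    # has the closed-form solution x(t) = x0 * exp(-t).
    return -x

def resnet_forward(x0, depth):
    # Residual iterations x_{k+1} = x_k + (1/depth) * f(x_k):
    # the forward Euler scheme for dx/dt = f(x) on the interval [0, 1].
    x = x0
    for _ in range(depth):
        x = x + f(x) / depth
    return x

x0 = 1.0
ode_solution = x0 * np.exp(-1.0)  # exact hidden state at "time" t = 1
for depth in (4, 32, 256):
    err = abs(resnet_forward(x0, depth) - ode_solution)
    print(f"depth={depth:4d}  |error vs ODE| = {err:.5f}")
```

The printed error shrinks as depth increases, which is the sense in which the discrete hidden-state trajectory converges to the continuous one.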

Publications

  • Michael E. Sander, Pierre Ablin, Gabriel Peyré. Do Residual Neural Networks discretize Neural Ordinary Differential Equations? NeurIPS, 2022. Paper, GitHub

  • Samy Jelassi, Michael E. Sander, Yuanzhi Li. Vision Transformers provably learn spatial structure. NeurIPS, 2022. Paper

  • Michael E. Sander, Pierre Ablin, Mathieu Blondel, Gabriel Peyré. Sinkformers: Transformers with Doubly Stochastic Attention. AISTATS, 2022. Paper, GitHub, short presentation

  • Michael E. Sander, Pierre Ablin, Mathieu Blondel, Gabriel Peyré. Momentum Residual Neural Networks. ICML, 2021. Paper, GitHub, short presentation