I am a Ph.D. student at École Normale Supérieure (ENS Paris). I work with Gabriel Peyré and Mathieu Blondel on building and studying new deep learning models. I was a Student Researcher at Google DeepMind from September 2022 to March 2023.

I graduated from École Polytechnique in 2020 and hold a master's degree from ENS Paris-Saclay in mathematics, vision and learning (MVA), as well as a master's degree from Sorbonne Université in mathematics (modeling).

Research interests

My main research interests lie at the intersection of deep learning, dynamical systems, optimal transport, and differentiable learning. I am particularly interested in the neural ODE framework as a tool for designing and studying new deep architectures, among them ResNets and Transformers. I am also interested in the convergence of the hidden-state trajectories of ResNets to the continuous trajectories of neural ODEs.
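The connection between ResNets and neural ODEs can be sketched in a few lines: a residual block x_{k+1} = x_k + (1/N) f(x_k) is an explicit Euler step for the ODE dx/dt = f(x), so as the depth N grows the hidden-state trajectory approaches the continuous flow. Below is a minimal illustration of this idea; the vector field `f` (a tanh of a linear map) and all numerical values are hypothetical choices for the example, not taken from any of the papers listed here.

```python
import math

def f(x, w):
    # Hypothetical vector field: f(x)_i = tanh(sum_j w[i][j] * x[j]).
    return [math.tanh(sum(wij * xj for wij, xj in zip(row, x)))
            for row in w]

def resnet_forward(x0, w, n_layers):
    # Residual updates x_{k+1} = x_k + (1/N) f(x_k): an explicit Euler
    # discretization of dx/dt = f(x) with step size 1/N over time [0, 1].
    x = list(x0)
    for _ in range(n_layers):
        fx = f(x, w)
        x = [xi + fxi / n_layers for xi, fxi in zip(x, fx)]
    return x

w = [[0.2, -0.1], [0.3, 0.1]]   # toy weights, shared across layers
x0 = [1.0, -0.5]

# Doubling the depth barely changes the output: both networks track
# the same underlying ODE flow.
coarse = resnet_forward(x0, w, 50)
fine = resnet_forward(x0, w, 100)
print(max(abs(a - b) for a, b in zip(coarse, fine)))
```

Here the weights are shared across layers; with depth-dependent weights, whether and in what sense the discrete trajectories still converge to a continuous flow is precisely the kind of question studied in the papers below.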


Publications

  • Michael E. Sander, Raja Giryes, Taiji Suzuki, Mathieu Blondel, Gabriel Peyré. How do Transformers perform In-Context Autoregressive Learning? Preprint

  • Pierre Marion, Yu-Han Wu, Michael E. Sander, Gérard Biau. Implicit regularization of deep residual networks towards neural ODEs. ICLR, 2024 (Spotlight). Paper

  • Michael E. Sander, Tom Sander, Maxime Sylvestre. Unveiling the secrets of paintings: deep neural networks trained on high-resolution multispectral images for accurate attribution and authentication. QCAV, 2023.

  • Michael E. Sander, Joan Puigcerver, Josip Djolonga, Gabriel Peyré, Mathieu Blondel. Fast, Differentiable and Sparse Top-k: a Convex Analysis Perspective. ICML, 2023. Paper

  • Michael E. Sander, Pierre Ablin, Gabriel Peyré. Do Residual Neural Networks discretize Neural Ordinary Differential Equations? NeurIPS, 2022. Paper, GitHub

  • Samy Jelassi, Michael E. Sander, Yuanzhi Li. Vision Transformers provably learn spatial structure. NeurIPS, 2022. Paper

  • Michael E. Sander, Pierre Ablin, Mathieu Blondel, Gabriel Peyré. Sinkformers: Transformers with Doubly Stochastic Attention. AISTATS, 2022. Paper, GitHub, short presentation

  • Michael E. Sander, Pierre Ablin, Mathieu Blondel, Gabriel Peyré. Momentum Residual Neural Networks. ICML, 2021. Paper, GitHub, short presentation