Research
I am a Professor in Statistics and Machine Learning at École Polytechnique (CMAP, Institut Polytechnique de Paris) and scientific co-director of Hi! PARIS.
My work is centered on the mathematical foundations of machine learning: the analysis of stochastic algorithms, the theory of federated and decentralized learning, and uncertainty quantification. A recurring theme is understanding how statistical and computational constraints interact at scale — and developing new tools or guarantees for methods used in practice.
After graduating from ENS Paris, I completed my PhD there (in the fantastic Sierra team), supervised by Francis Bach, including a wonderful visiting period at UC Berkeley with Martin Wainwright. I then held a postdoctoral position at EPFL with Martin Jaggi.
I joined École Polytechnique in 2019 as an Assistant Professor, defended my habilitation (HDR) in 2023, and was promoted to Professor the same year. Since 2025, I have served as Scientific Co-Director of Hi! PARIS, our Center on Data Analytics and Artificial Intelligence for Science, Business and Society, created by Institut Polytechnique de Paris (IP Paris) and HEC Paris and joined by Inria (Centre Inria de Saclay).
Stochastic optimization · Federated and decentralized learning · Uncertainty quantification · Missing data · First-order optimization · Statistical learning theory
News
- Mar 2026 🔥 Mini-tutorial on Conformal Prediction at the SIAM Conference on Uncertainty Quantification (UQ26), Minneapolis. Slides →
- Mar 2026 Tutorial on PEPit with Adrien Taylor at SMAI-MODE: computer-assisted worst-case analyses of first-order optimization methods. Tutorial page and materials →
- Mar 2026 Invited seminar tour: Columbia University, University of Pennsylvania, NYU, and Flatiron Institute. Talk on "Optimization & Computer-Assisted Proofs: Applications in Federated Learning".
- Feb 2026 Invited talk at the Simons Institute Workshop on Federated and Collaborative Learning, UC Berkeley.
- Jan 2026 Scalable Utility-Aware Multiclass Calibration (with M. Hegazy, M.I. Jordan) accepted at AISTATS 2026 — spotlight.
- Jan 2026 Tight Analysis of Decentralized SGD: A Markov Chain Perspective (with L. Versini, P. Mangold) accepted at AISTATS 2026.
- Jan 2026 Manu Upadhyaya joins as a postdoctoral researcher at Inria Paris, co-advised with Baptiste Goujaud and Adrien Taylor.
- Jul 2025 Tutorial at Hi! PARIS Summer School on Conformal Prediction and Uncertainty Quantification. Slides →
- Jul 2025 Lucas Versini, Daniel Berg Thomsen, and Abel Douzal join the group.
- May 2025 Three papers accepted at ICML 2025: Achieving optimal breakdown for byzantine robust gossip (with R. Gaucher, H. Hendrikx), Compressed least-squares regression (with C. Philippenko, JMLR→conference track), and SCAFFOLD with stochastic gradients (with P. Mangold, A. Durmus, E. Moulines).
- Jan 2025 Federated Averaging and Richardson-Romberg extrapolation accepted at AISTATS 2025 (with P. Mangold, A. Durmus, S. Samsonov, E. Moulines).
- 2025 Provable non-accelerations of the heavy-ball method (with B. Goujaud, A. Taylor) published in Mathematical Programming.
- Nov 2024 PEPit accepted in Mathematical Programming Computation (with B. Goujaud, C. Moucer, F. Glineur, J.M. Hendrickx, A. Taylor).
- Jul 2024 Tutorial on Conformal Prediction at ICML 2024. Slides →
- Jun 2024 Margaux Zaffran defends her PhD thesis!
- Apr 2024 Two papers accepted at ICML 2024: Random features and naive imputation (with A. Ayme, C. Boyer, E. Scornet) and Sliced-Wasserstein with spherical harmonics (with R. Leluc et al.).
- Apr 2024 Baptiste Goujaud defends his PhD thesis!
- Dec 2023 Two papers accepted at AISTATS 2024: linear mode connectivity via OT (with D. Ferbach, B. Goujaud, G. Gidel) and compression with exact error distribution (with M. Hegazy, R. Leluc, C.T. Li).
- Sep 2023 Renaud Gaucher joins as a PhD student, co-advised with Hadrien Hendrikx.
- May 2023 Two papers at ICML 2023: conformal prediction with missing values (with M. Zaffran, J. Josse, Y. Romano) and naive imputation in high dimensions (with A. Ayme, C. Boyer, E. Scornet).
- Apr 2023 Rémi Leluc, Damien Ferbach, and Mahmoud Hegazy join the group.
- Mar 2023 Defended my habilitation to supervise research (HDR) at École Polytechnique. Manuscript · slides.
- May 2022 Two papers accepted at ICML 2022: Adaptive Conformal Predictions for Time Series (with M. Zaffran, O. Féron, Y. Goude, J. Josse) and Minimax rate of consistency for linear models with missing values (with A. Ayme, C. Boyer, E. Scornet).
- Jan 2022 Three papers at AISTATS 2022: differentially private FL (with M. Noble, A. Bellet), super-acceleration with cyclical step sizes (with B. Goujaud et al.), and QLSD for Bayesian FL (with M. Vono et al.).
- Jan 2022 Released PEPit: a Python package for computer-assisted worst-case analyses in first-order optimization. GitHub · preprint.
- Nov 2021 Two papers accepted at NeurIPS 2021: Federated Expectation Maximization with heterogeneity mitigation (with G. Fort, E. Moulines, G. Robin) and Preserved central model for faster bidirectional compression (with C. Philippenko).
- Sep 2021 Co-organizing the Federated Learning Workshop at Sorbonne Université, Paris (with Owkin, Accenture, SFDS).
- Apr 2021 Alexis Ayme joins as a PhD student on learning with missing data, co-advised with C. Boyer and E. Scornet.
- Apr 2021 Maxence Noble joins as a research intern on utility-privacy tradeoffs in federated learning, co-advised with A. Bellet.
- Dec 2020 Margaux Zaffran joins as a PhD student on Electricity Price Prediction (with EDF), co-advised with J. Josse, Y. Goude, O. Féron.
- Oct 2020 Baptiste Goujaud joins as a PhD student on First-Order Optimization, co-advised with Adrien Taylor.
- Sep 2020 Debiasing Stochastic Gradient Descent to handle missing values accepted at NeurIPS 2020 (with A. Sportisse, C. Boyer, J. Josse).
- Jun 2020 On Convergence-Diagnostic based Step Sizes for SGD accepted at ICML 2020 (with S. Pesme, N. Flammarion).
- Jan 2020 Context Mover's Distance & Barycenters accepted at AISTATS 2020 (with S.P. Singh, A. Hug, M. Jaggi).
- Dec 2019 Constantin Philippenko joins as a PhD student on Federated Learning, co-advised with E. Moulines.
- Dec 2019 Two papers at NeurIPS 2019: Unsupervised Scalable Representation Learning for Multivariate Time Series (with J.-Y. Franceschi, M. Jaggi) and Communication trade-offs for distributed SGD with large step size (with K.K. Patel).
Group
I am very fortunate to work with a wonderful group of PhD students and collaborators. If you are interested in joining, feel free to reach out!
Postdoctoral researchers
- Jan 2026– Manu Upadhyaya — postdoc at Inria Paris, co-advised with Baptiste Goujaud and Adrien Taylor.
Ongoing PhD students
- 2023– Renaud Gaucher — co-advised with Hadrien Hendrikx (Inria).
- 2024– Mahmoud Hegazy — co-advised with Michael I. Jordan (UC Berkeley / Inria).
- 2025– Daniel Berg Thomsen — co-advised with Adrien Taylor.
- 2025– Lucas Versini — co-advised with Paul Mangold.
- 2026– Andrei Pantea — CIFRE PhD with EDF, co-advised with Yannig Goude.
Research engineers
- 2024– Deepika Singh — Predictive maintenance on railway systems.
Former PhD students
- 2024 Margaux Zaffran — Conformal prediction and time series. Co-advised with J. Josse, Y. Goude, O. Féron. Postdoc at UC Berkeley, then Inria Saclay.
- 2024 Baptiste Goujaud — First-order optimization, PEPit. Co-advised with Adrien Taylor. Now Assistant Professor at Télécom SudParis.
- 2024 Alexis Ayme — Missing data and high-dimensional learning. Co-advised with C. Boyer and E. Scornet. Now postdoc at ENS Ulm (CNRS) with Bruno Loureiro.
- 2023 Constantin Philippenko — Federated learning and compression. Now Data & Machine Learning Scientist at AFP.
🏆 While these distinctions are entirely the students' achievements, I am proud that several of them have been recognized with major awards:
- Margaux Zaffran — Jacques Neveu PhD Award (SMAI); Paul Caseau PhD Thesis Award (EDF & National Academy of Technologies of France, Jan 2026); Maths, Businesses & Society PhD Award (AMIES, SFdS, SMAI, SMF); Prix L'Oréal pour les Femmes et la Science.
- Baptiste Goujaud — Best PhD in Mathematics, IP Paris 2025.
Former postdoctoral researchers
- Jean-Baptiste Fest — EM methods in decentralized learning.
- Rémi Leluc — Statistics, optimal transport.
Former research interns
- Scott Pesme — then PhD with Nicolas Flammarion at EPFL.
- Maxence Noble — then PhD with Alain Durmus at IP Paris.
- Damien Ferbach — then PhD with Gauthier Gidel at MILA.
- Kumar Kshitij Patel — then PhD with Nati Srebro at TTIC.