
Aymeric Dieuleveut
Professor
Palaiseau, France
Bâtiment Alan Turing,
Office 2143
I am a Professor in Statistics and Learning at École Polytechnique, Palaiseau, in the applied mathematics department.
My main research interests are statistics, optimization, stochastic approximation, Federated Learning, high-dimensional learning, non-parametric statistics, and scalable kernel methods.
Before that, I was a postdoctoral researcher at EPFL (École Polytechnique Fédérale de Lausanne), in the MLO team directed by Martin Jaggi. Before that, I was a Ph.D. student in the Sierra team, which is part of the DI/ENS (Computer Science Department of École Normale Supérieure), supervised by Francis Bach. I graduated from École Normale Supérieure de Paris (Ulm) in 2014 and obtained a Master's degree in Mathematics, Probability and Statistics (at Université Paris-Sud, Orsay).
From March to August 2016, I was a visiting scholar at the University of California, Berkeley, under the supervision of Martin Wainwright.
News!
09/24: Lecture in Orsay - Please Register HERE
07/24: ICML tutorial on Conformal Prediction!
06/24: Margaux Zaffran has successfully defended her PhD thesis!
04/24: Mahmoud Hegazy is starting his internship and PhD! Mahmoud is supervised by Michael Jordan and myself.
04/24: Baptiste Goujaud has successfully defended his PhD thesis!
04/24: Two papers have been accepted to ICML 2024:
- Random features models: a way to study the success of naive imputation, with A. Ayme, C. Boyer, E. Scornet
- Sliced-Wasserstein Estimation with Spherical Harmonics as Control Variates, with R. Leluc, A. Dieuleveut, F. Portier, J. Segers, A. Zhuman
12/23: Two papers have been accepted to AISTATS 2024! Congratulations to Damien Ferbach and Mahmoud Hegazy for the great work.
- Proving linear mode connectivity of neural networks via optimal transport, with D. Ferbach, B. Goujaud, G. Gidel, A. Dieuleveut
- Compression with Exact Error Distribution for Federated Learning, with M. Hegazy, R. Leluc, C.T. Li, A. Dieuleveut
09/23: Renaud Gaucher is starting his PhD! Renaud will be working on Decentralized Optimization, within the REDEEM PEPR project. He is supervised by Hadrien Hendrikx and myself.
05/23: Two papers have just been accepted to ICML 2023! Congratulations to Margaux Zaffran and Alexis Ayme for their second PhD papers!
- Conformal Prediction with Missing Values, with M. Zaffran, J. Josse, Y. Romano
- Naive imputation implicitly regularizes high-dimensional linear models, with A. Ayme, C. Boyer, E. Scornet
04/23: Welcome to Renaud Gaucher (PhD candidate), Rémi Leluc (Postdoc), Damien Ferbach (Intern), and Mahmoud Hegazy (Intern), who are joining my group for the coming months (or years)!
04/23: I successfully defended my Accreditation to supervise research. Many thanks to the reviewers and the jury.
04/23: A couple of new preprints are available: a survey on stochastic approximation methods beyond the gradient case, and a paper on a constructive approach to building counter-examples in first-order optimization. Links below!
01/23: Lecture in Orsay: lecture page.
12/22: We are looking for a PhD student with Hadrien Hendrikx. More details!
12/22: I am looking for a postdoc on Federated Learning! If you are interested, send me an email for more details.
05/22: Two papers have just been accepted to ICML 2022! Congratulations to Margaux Zaffran and Alexis Ayme for their first PhD papers!
- Adaptive Conformal Predictions for Time Series, with M. Zaffran, O. Féron, Y. Goude, J. Josse
- Minimax rate of consistency for linear models with missing values, with A. Ayme, C. Boyer, E. Scornet
03/22: LPSM. Federated Learning and optimization: from a gentle introduction to recent results. Slides.
01/22: Three papers have been accepted to AISTATS 2022! Special congratulations to Maxence for his first paper.
- Differentially Private Federated Learning on Heterogeneous Data, with Maxence Noble, Aurélien Bellet
- Super-Acceleration with Cyclical Step-sizes, with Baptiste Goujaud, Damien Scieur, Adrien Taylor, Fabian Pedregosa
- QLSD: Quantised Langevin stochastic dynamics for Bayesian federated learning, with Maxime Vono, Vincent Plassier, Alain Durmus and Éric Moulines
01/22: Happy new year! We have released the first version of PEPit, a Python package for computer-assisted proofs. It can be incredibly useful if you are interested in worst-case guarantees for first-order methods (a minimal usage sketch follows below). The code is available on GitHub, and you can have a look at
- the demo notebook,
- the preprint
- and the documentation.
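To give a taste of what PEPit automates, here is a minimal sketch, assuming a recent version of the package (the exact signature of declare_function has varied across releases, so treat this as illustrative rather than definitive). It asks for the worst-case contraction, over all L-smooth and mu-strongly convex functions, of a single gradient step with step size 1/L:

    from PEPit import PEP
    from PEPit.functions import SmoothStronglyConvexFunction

    # Set up the Performance Estimation Problem.
    problem = PEP()
    L, mu = 1.0, 0.1
    gamma = 1.0 / L  # step size

    # Declare an L-smooth, mu-strongly convex objective, its minimizer xs,
    # and an initial point x0 with ||x0 - xs||^2 <= 1.
    func = problem.declare_function(SmoothStronglyConvexFunction, mu=mu, L=L)
    xs = func.stationary_point()
    x0 = problem.set_initial_point()
    problem.set_initial_condition((x0 - xs) ** 2 <= 1)

    # One step of gradient descent, and the quantity to bound in the worst case.
    x1 = x0 - gamma * func.gradient(x0)
    problem.set_performance_metric((x1 - xs) ** 2)

    # Solve the underlying SDP; for gamma = 1/L the worst-case contraction
    # factor is max((1 - gamma*mu)^2, (1 - gamma*L)^2) = (1 - mu/L)^2 = 0.81.
    tau = problem.solve()
    print(tau)

The solve() call relies on an SDP backend (cvxpy by default), so worst-case rates come with numerical certificates instead of pen-and-paper derivations.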
11/21 - Two new preprints are available, respectively on Compression on Gaussian random codebooks with applications to FL, and on Utility-Privacy tradeoffs in Heterogeneous FL frameworks. See below for links and details!
Two papers have been accepted at NeurIPS 2021! Federated Expectation Maximization with heterogeneity mitigation and variance reduction, and Preserved central model for faster bidirectional compression in distributed settings; see below for links and details!
09/21 - FLOW - Federated Learning One World Seminar. Presentation on Preserved central model for faster bidirectional compression in distributed settings. Slides.
On September 16, 2021, join us at the Federated Learning Workshop, a full-day hybrid event that takes place both online and in Paris. A great panel of speakers from academia and industry will forecast the most promising directions for future research on federated learning and the development of new benchmarks and application challenges.
07/2021: Lab on Optimization, CEMRACS.
- Notebook on optimization methods - COLAB;
- Notebook on optimization methods;
- Bits of code for the correction;
- Notebook on optimization methods - CORRECTION.
07/2021: Tutorial on Stochastic Optimization, Hi! PARIS Summer School 2021 on AI & Data for Science, Business and Society. Slides (without annotations); Slides (with annotations).
06/2021: A couple of new papers are available on arXiv, in particular on "super acceleration" (faster than momentum acceleration under assumptions on the Hessian matrix) and Quantized Langevin Dynamics.
04/2021: Maxence Noble is starting his research internship! Maxence will be working on utility and privacy tradeoffs in Federated Learning, and is co-supervised by Aurélien Bellet.
04/2021: Alexis Ayme is starting his research internship and then his PhD thesis! Alexis will be working on learning with missing data, and is co-supervised by Claire Boyer and Erwan Scornet. His web page.
12/2020: Margaux Zaffran is starting her PhD! Margaux will be working on Electricity Price Prediction, with EDF. She is co-supervised by Julie Josse (Inria), Yannig Goude and Olivier Féron.
10/2020: Baptiste Goujaud is starting his PhD! Baptiste will be working on First Order Optimization. He is supervised by myself and Éric Moulines. Baptiste has already worked on optimization during his time at Mila, especially on the tuning and convergence of first-order optimization algorithms.
10/2020: Our team is still looking for postdocs and research engineers, with very competitive conditions! If you have a PhD in statistics, optimization, or machine learning and are interested in joining a great team in Paris, send me an email.
09/2020: Our paper "Debiasing Stochastic Gradient Descent to handle missing values" has been accepted at NeurIPS 2020. This is joint work with Aude Sportisse, Claire Boyer and Julie Josse. See the arXiv version.
06/2020: Our paper "On Convergence-Diagnostic based Step Sizes for Stochastic Gradient Descent" was accepted at ICML 2020. This is joint work with Scott Pesme and Nicolas Flammarion at EPFL. See the arXiv version.
04/2020: The applied mathematics department at École Polytechnique has open positions for Tenure Track professors, one on Statistics and the other one on Statistics and Energy. These positions offer competitive conditions.
10/03/2020: Optimization for Machine Learning workshop in Luminy. On Convergence-Diagnostic based Step Sizes for Stochastic Gradient Descent. Slides here!
26/01/2020: The third edition of the "Advances in Machine Learning, theory meets practice" workshop at Applied Machine Learning Days (AMLD) in Lausanne, which we co-organized with Sebastian Stich, was a nice occasion to bring theoreticians and practitioners together: thanks to the speakers for their great talks! See the workshop page for slides and details.
01/2020: Our paper on using optimal transport for NLP has been accepted to AISTATS 2020!
12/2019: Our paper on unsupervised time series representation has successfully passed the NeurIPS reproducibility challenge! About 70 papers published at NeurIPS 2019 were picked by independent researchers, who reproduced the results, assessed whether the description of the framework was complete, and provided feedback. You can find the discussion on our paper here and the full 12-page report on our work here. F. Liljefors, M. M. Sorkhei and S. Broomé were able to reproduce and reimplement our methods from the description given in the paper, and obtained the same results as in our original paper! We would like to thank them for their work.
12/2019: I will be presenting two papers at NeurIPS 2019 in Vancouver: the first on distributed optimization, more specifically Local SGD, with K. K. Patel, and the second on a new method to generate representations of time series in an unsupervised fashion, with J.-Y. Franceschi and M. Jaggi. Links to the papers below!
12/2019: Constantin Philippenko is starting his PhD! Constantin will be working on Federated Learning, especially on problems arising from privacy concerns. He will be supervised by myself and Éric Moulines, and will also be working with Richard Vidal and Laetitia Kameni from the research team at Accenture. Welcome to my first PhD student! :)
10/2019: I will be giving a lecture on Large Scale Learning at the Autumn School in Machine Learning, Tbilisi, Georgia. You can find the slides here.
Publications and Preprints
Workshops
Teaching
Reviews and Committees
I was a member of the jury for Belhal Karimi's PhD defense, on September 19, 2019.
I was a referee for Luigi Carratino's PhD dissertation, which was defended in early spring 2020.
I am part of the scientific committee for the seminar le Palaisien.
Some Talks
March 2022 - LPSM. Federated Learning and optimization: from a gentle introduction to recent results. Slides.
September 2021 - FLOW - Federated Learning One World Seminar. Presentation on Preserved central model for faster bidirectional compression in distributed settings. Slides.
March 2021, Bi-directional compression for Federated Learning: Artemis & MCM. Seminar at Télécom Paris. Slides here!
February 2021, Debiasing Averaged Stochastic Gradient Descent to handle missing values, Séminaire de Statistiques Parisien. Slides here
March 2020, On Convergence-Diagnostic based Step Sizes for Stochastic Gradient Descent, Optimization for Machine Learning workshop. Slides here!
March 2020, On Convergence-Diagnostic based Step Sizes for Stochastic Gradient Descent, Parisian Seminar of Optimization.
January 2020, On Convergence-Diagnostic based Step Sizes for Stochastic Gradient Descent, Inria Paris.
October 2019, Large Scale Learning and Optimization, Tbilisi, Georgia. 6-hour lecture at ASML, slides here!
April 2019, Journées Calcul et Apprentissage, Lyon.
December 2018, Communication trade-offs for synchronized distributed SGD with large step size, CMStatistics 2018, Pisa, Italy [slides]
January 2018, Tutorial on Optimization, "YSP" days organized by the SFdS, Institut Henri Poincaré [slides]
December 2017, Stochastic algorithms in Machine Learning, tutorial at the "journée algorithmes stochastiques", Paris Dauphine [slides]
November 2017, Stochastic approximation and Markov chains, invited talk, Télécom ParisTech, Paris [slides]
February 2017, Scalable methods for Statistics, a short presentation, Cambridge, UK [slides]
March 2016, Non-parametric stochastic approximation, UC Berkeley.
October 2015, Tradeoffs of learning in Hilbert spaces, ENSAI Rennes [slides]
June 2015, Non-parametric Stochastic Approximation, Machine Learning Summer School, Tübingen.
Habilitation defense!
I defended my Accreditation to Supervise Research on Tuesday, March 28, at 10 am.
You can download the final version of the manuscript.
You can also have a look at the slides.
Thesis defense!
I defended my thesis on Thursday, September 28, at 2:30 pm, at Inria.
You can download the final version of the manuscript (or here if you want to print it).
You can also have a look at the slides.