Bâtiment Alan Turing,
I am an assistant professor in Statistics at École Polytechnique, Palaiseau, in the Applied Mathematics department.
Before that, I was a postdoctoral researcher at EPFL (École Polytechnique Fédérale de Lausanne) in the MLO team, led by Martin Jaggi. Before that, I was a Ph.D. student in the Sierra team, part of the DI/ENS (Computer Science Department of École Normale Supérieure), supervised by Francis Bach. I graduated from École Normale Supérieure de Paris (Ulm) in 2014 with a Master's degree in Mathematics, Probability and Statistics (Université Paris-Sud, Orsay).
My main research interests are statistics, optimization, stochastic approximation, high-dimensional learning, non-parametric statistics, and scalable kernel methods.
From March to August 2016, I was a visiting scholar at the University of California, Berkeley, under the supervision of Martin Wainwright.
October 2020: Our team is still looking for postdocs and research engineers, with very competitive conditions! If you have a PhD in statistics, optimization, or machine learning and are interested in joining a great team in Paris, send me an email.
09/2020: Our paper "Debiasing Stochastic Gradient Descent to handle missing values" has been accepted at NeurIPS 2020. This is joint work with Aude Sportisse, Claire Boyer, and Julie Josse. See the arXiv version.
06/2020: Our paper "On Convergence-Diagnostic based Step Sizes for Stochastic Gradient Descent" was accepted at ICML 2020. This is joint work with Scott Pesme and Nicolas Flammarion at EPFL. See the arXiv version.
04/2020: The Applied Mathematics department at École Polytechnique has open tenure-track professor positions, one in Statistics and one in Statistics and Energy. These positions offer competitive conditions.
26/01/2020: The third edition of the "Advances in Machine Learning: Theory Meets Practice" workshop at the Applied Machine Learning Days (AMLD) in Lausanne, which we co-organized with Sebastian Stich, was a nice occasion to bring theoreticians and practitioners together: thanks to the speakers for their great talks! See the workshop page for slides and details.
01/2020: Our paper on using optimal transport for NLP has been accepted at AISTATS 2020!
12/2019: Our paper on unsupervised time series representation has successfully passed the NeurIPS reproducibility challenge! About 70 papers published at NeurIPS 2019 were picked by independent researchers, who attempted to reproduce the results, assess whether the description of the framework was complete, and provide feedback. You can find the discussion of our paper here and the full 12-page report on our work here. F. Liljefors, M. M. Sorkhei, and S. Broomé were able to reproduce and reimplement our methods from the description given in the paper, and to obtain the same results as in our original paper! We would like to thank them for their work!
12/2019: I will be presenting two papers at NeurIPS 2019 in Vancouver. The first is on distributed optimization, more specifically Local SGD, with K. K. Patel; the second is on a new method to generate representations of time series in an unsupervised fashion, with J.-Y. Franceschi and M. Jaggi. Links to the papers below!
12/2019: Constantin Philippenko is starting his PhD! Constantin will be working on Federated Learning, especially on problems arising from privacy concerns. He will be supervised by Éric Moulines and myself, and will also be working with Richard Vidal and Laetitia Kameni from the research team at Accenture. Welcome to my first PhD student! :)
Reviews and Committees
I was a member of the jury for Belhal Karimi's PhD defense on September 19, 2019.
I was a referee for Luigi Carratino's PhD dissertation, which was defended in early spring 2020.
I am part of the scientific committee for the seminar le Palaisien.
March 2020, On Convergence-Diagnostic based Step Sizes for Stochastic Gradient Descent, Parisian Seminar of Optimization.
January 2020, On Convergence-Diagnostic based Step Sizes for Stochastic Gradient Descent, Inria Paris.
October 2019, Large Scale Learning and Optimization, Tbilisi, Georgia. 6-hour lecture at ASML, slides here!
April 2019, Journées Calcul et Apprentissage, Lyon
December 2018, Communication trade-offs for synchronized distributed SGD with large step size, CMStatistics 2018, Pisa, Italy [slides]
January 2018, Optimization tutorial, "YSP" days organized by the SFdS, Institut Henri Poincaré [slides]
December 2017, Stochastic algorithms in Machine Learning, tutorial at the "Journée algorithmes stochastiques", Paris Dauphine [slides]
November 2017, Stochastic approximation and Markov chains, invited talk, Télécom ParisTech, Paris [slides]
February 2017, Scalable methods for Statistics, a short presentation, Cambridge, UK [slides]
March 2016, Non-parametric stochastic approximation, UC Berkeley.
October 2015, Trade-offs of learning in Hilbert spaces, ENSAI
June 2015, Non-parametric Machine Learning Summer
I defended my thesis on Thursday, September 28, at 2:30 pm, at Inria. [slides]