The aim of my research is to enhance our understanding of modern machine learning algorithms. In particular, I am interested in exploring how the
geometric structure of natural data, such as symmetries, invariances, and compositionality, enables efficient learning in high dimensions. I enjoy both studying analytical (toy) models and performing numerical experiments.
Previously, I completed a joint Master’s degree in theoretical physics at Sorbonne Université, Politecnico di Torino, SISSA, and ICTP.
|Nov 2022||Our 2021 NeurIPS papers on locality and stability to diffeomorphisms have been published in JSTAT as part of the Special Issue on the Statistical Physics Aspects of Machine Learning and AI.|
|Aug 2022||A new preprint is out! We show how the multi-scale structure of deep CNNs is reflected in the spectral structure of their NTK, allowing them to adapt to the spatial scale of the task!|
|Jun 2022||This summer, I’m attending the Machine Learning Theory Summer School at Princeton University and the Summer School on Statistical Physics and Machine Learning at Les Houches School of Physics.|
|Apr 2022||I’m giving a lightning talk on the locality prior of convolutional networks at the Workshop on the Theory of Overparameterized Machine Learning organised by Rice University.|