Alessandro Favero


Hey there! I’m a PhD candidate at EPFL, where I’m fortunate to be advised by Matthieu Wyart and Pascal Frossard. During the summer of 2023, I interned as an applied scientist at Amazon’s AWS AI Labs in the Bay Area.

My research revolves around understanding the fundamental principles underlying deep learning. Currently, I am particularly interested in the interplay between sample complexity and data structure, as well as in large-scale vision-language models.

Prior to my PhD, I earned a joint Master’s degree in theoretical physics from Sorbonne Université, Politecnico di Torino, SISSA, and ICTP.

recent articles

  1. Multi-modal hallucination control by visual information grounding
    Alessandro Favero, Luca Zancato, Matthew Trager, Siddharth Choudhary, Pramuditha Perera, Alessandro Achille, and 2 more authors
    Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2024
  2. A phase transition in diffusion models reveals the hierarchical nature of data
    Antonio Sclocchi, Alessandro Favero, and Matthieu Wyart
    arXiv preprint, 2024


Dec 2023 Task Arithmetic in the Tangent Space is an Oral at NeurIPS 2023. See you in NOLA! After the conference, I’m presenting our work at the EPFL CIS NeurIPS 2023 Regional Post-Event.
Jul 2023 This summer, I’m an applied scientist intern at Amazon’s AWS AI Labs in the San Francisco Bay Area, working on hallucinations in vision-language foundation models.
Mar 2023 I’m giving a talk in the Statistical Physics Meets Machine Learning session of the APS March Meeting in Las Vegas! Afterwards, I’m visiting MIT, NYU, and the Simons Foundation.
Nov 2022 Our 2021 NeurIPS papers on locality and stability to diffeomorphisms have been published in JSTAT as part of the Special Issue on the Statistical Physics Aspects of Machine Learning and AI.
Jun 2022 This summer, I’m attending the Machine Learning Theory Summer School at Princeton University and the Summer School on Statistical Physics and Machine Learning at Les Houches School of Physics.
Apr 2022 I’m giving a lightning talk on the locality prior of convolutional networks at the Workshop on the Theory of Overparameterized Machine Learning organized by Rice University.