publications

publications in reverse chronological order.

2025

  1. How Compositional Generalization and Creativity Improve as Diffusion Models are Trained
    Alessandro Favero, Antonio Sclocchi, Francesco Cagnetta, and 2 more authors
    In 42nd International Conference on Machine Learning (ICML), 2025
  2. Scaling Laws and Representation Learning in Simple Hierarchical Languages: Transformers vs. Convolutional Architectures
    Francesco Cagnetta, Alessandro Favero, Antonio Sclocchi, and 1 more author
    arXiv preprint, 2025
  3. Learning curves theory for hierarchically compositional data with power-law distributed features
    Francesco Cagnetta, Hyunmo Kang, and Matthieu Wyart
    In 42nd International Conference on Machine Learning (ICML), 2025

2024

  1. Towards a theory of how the structure of language is acquired by deep neural networks
    Francesco Cagnetta and Matthieu Wyart
    In Advances in Neural Information Processing Systems (NeurIPS) 37, 2024
  2. How Deep Neural Networks Learn Compositional Data: The Random Hierarchy Model
    Francesco Cagnetta*, Leonardo Petrini*, Umberto M. Tomasini, and 2 more authors
    Phys. Rev. X, 2024
  3. Kernels, Data & Physics
    Francesco Cagnetta, Deborah Oliveira, Mahalakshmi Sabanayagam, and 2 more authors
    J. Stat. Mech.: Theory Exp., 2024

2023

  1. How deep convolutional neural networks lose spatial information with training
    Umberto M. Tomasini*, Leonardo Petrini*, Francesco Cagnetta, and 1 more author
    Mach. Learn.: Sci. Technol., 2023
  2. What Can Be Learnt With Wide Convolutional Neural Networks?
    Francesco Cagnetta*, Alessandro Favero*, and Matthieu Wyart
    In 40th International Conference on Machine Learning (ICML), 2023

2022

  1. Learning sparse features can lead to overfitting in neural networks
    Leonardo Petrini*, Francesco Cagnetta*, Eric Vanden-Eijnden, and 1 more author
    In Advances in Neural Information Processing Systems (NeurIPS) 35, 2022

2021

  1. Locality defeats the curse of dimensionality in convolutional teacher-student scenarios
    Alessandro Favero*, Francesco Cagnetta*, and Matthieu Wyart
    In Advances in Neural Information Processing Systems (NeurIPS) 34, 2021