Cagnetta, Francesco

10 publications

ICML 2025. How Compositional Generalization and Creativity Improve as Diffusion Models Are Trained. Alessandro Favero, Antonio Sclocchi, Francesco Cagnetta, Pascal Frossard, Matthieu Wyart.
ICLRW 2025. How Compositional Generalization and Creativity Improve as Diffusion Models Are Trained. Alessandro Favero, Antonio Sclocchi, Francesco Cagnetta, Pascal Frossard, Matthieu Wyart.
ICML 2025. Learning Curves Theory for Hierarchically Compositional Data with Power-Law Distributed Features. Francesco Cagnetta, Hyunmo Kang, Matthieu Wyart.
NeurIPSW 2024. How Rare Events Shape the Learning Curves of Hierarchical Data. Hyunmo Kang, Francesco Cagnetta, Matthieu Wyart.
NeurIPSW 2024. Token-Token Correlations Predict the Scaling of the Test Loss with the Number of Input Tokens. Francesco Cagnetta, Matthieu Wyart.
NeurIPS 2024. Towards a Theory of How the Structure of Language Is Acquired by Deep Neural Networks. Francesco Cagnetta, Matthieu Wyart.
ICLRW 2023. How Deep Convolutional Neural Networks Lose Spatial Information with Training. Umberto Maria Tomasini, Leonardo Petrini, Francesco Cagnetta, Matthieu Wyart.
ICML 2023. What Can Be Learnt with Wide Convolutional Neural Networks? Francesco Cagnetta, Alessandro Favero, Matthieu Wyart.
NeurIPS 2022. Learning Sparse Features Can Lead to Overfitting in Neural Networks. Leonardo Petrini, Francesco Cagnetta, Eric Vanden-Eijnden, Matthieu Wyart.
NeurIPS 2021. Locality Defeats the Curse of Dimensionality in Convolutional Teacher-Student Scenarios. Alessandro Favero, Francesco Cagnetta, Matthieu Wyart.