The Role of Over-Parameterization in Machine Learning - The Good, the Bad, the Ugly

Abstract

The conventional wisdom favoring simple models in machine learning misses the bigger picture, especially for over-parameterized neural networks (NNs), where the number of parameters is much larger than the number of training data points. Our goal is to explore the mystery behind over-parameterized models from a theoretical perspective. In this talk, I will discuss the role of over-parameterization in neural networks and why such models can perform well. First, I will examine over-parameterization from the perspective of models, to theoretically understand why over-parameterized networks can generalize well. Second, I will discuss the effects of over-parameterization on robustness and privacy. Third, I will present a function space view of over-parameterization, spanning kernel methods to neural networks. Finally, moving from classical statistical learning to sequential decision making, I will discuss how over-parameterization benefits function approximation in deep reinforcement learning. Potential future directions for the theory of over-parameterized ML will also be discussed.
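
To make the notion of over-parameterization concrete, here is a minimal sketch (not part of the talk) that counts the parameters of a small fully connected network against a hypothetical training-set size; the layer widths and sample count below are illustrative assumptions, not figures from the abstract.

```python
def mlp_param_count(widths):
    """Count weights and biases of a fully connected network
    whose layer sizes are given by `widths`."""
    return sum(w_in * w_out + w_out for w_in, w_out in zip(widths, widths[1:]))

n_train = 1_000                   # hypothetical number of training samples
widths = [100, 2048, 2048, 10]    # input -> two hidden layers -> output (hypothetical)

n_params = mlp_param_count(widths)
print(f"parameters: {n_params:,}  training samples: {n_train:,}")
print(f"over-parameterized (p >> n): {n_params > n_train}")
# roughly 4.4 million parameters against 1,000 samples,
# i.e. the p >> n regime the abstract refers to
```

Even this modest architecture lands far inside the regime where the parameter count dwarfs the sample size, which is the setting the talk's theoretical questions concern.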

Cite

Text

Liu. "The Role of Over-Parameterization in Machine Learning - The Good, the Bad, the Ugly." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I20.30290

Markdown

[Liu. "The Role of Over-Parameterization in Machine Learning - The Good, the Bad, the Ugly." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/liu2024aaai-role/) doi:10.1609/AAAI.V38I20.30290

BibTeX

@inproceedings{liu2024aaai-role,
  title     = {{The Role of Over-Parameterization in Machine Learning - The Good, the Bad, the Ugly}},
  author    = {Liu, Fanghui},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {22674},
  doi       = {10.1609/AAAI.V38I20.30290},
  url       = {https://mlanthology.org/aaai/2024/liu2024aaai-role/}
}