Primal and Dual Analysis of Entropic Fictitious Play for Finite-Sum Problems

Abstract

The entropic fictitious play (EFP) is a recently proposed algorithm that minimizes the sum of a convex functional and entropy in the space of measures — such an objective naturally arises in the optimization of a two-layer neural network in the mean-field regime. In this work, we provide a concise primal-dual analysis of EFP in the setting where the learning problem exhibits a finite-sum structure. We establish quantitative global convergence guarantees for both the continuous-time and discrete-time dynamics based on properties of a proximal Gibbs measure introduced in Nitanda et al. (2022). Furthermore, our primal-dual framework entails a memory-efficient particle-based implementation of the EFP update, and also suggests a connection to gradient boosting methods. We illustrate the efficiency of our novel implementation in experiments including neural network optimization and image synthesis.
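The EFP update replaces the current measure by a mixture with the proximal Gibbs measure, μ ← (1 − α)μ + α p_μ, where p_μ ∝ exp(−δF/δμ · 1/λ). As a rough illustration only (not the memory-efficient implementation proposed in the paper), the sketch below runs this mixture update on particles for a hypothetical toy functional; the functional, parameter values, and Langevin inner sampler are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 0.5       # entropy regularization weight (illustrative value)
alpha = 0.3     # fictitious-play mixing weight (illustrative value)
m_star = 1.0    # target mean in the toy functional (hypothetical)

# Toy convex functional: F(mu) = E_mu[x^2]/2 + (E_mu[x] - m_star)^2 / 2.
# Its first variation at mu is V_mu(x) = x^2/2 + (m - m_star) * x with m = E_mu[x],
# so the proximal Gibbs measure p_mu ∝ exp(-V_mu(x)/lam) is N(m_star - m, lam).

def grad_first_variation(x, m):
    """d/dx of V_mu(x) = x^2/2 + (m - m_star) * x."""
    return x + (m - m_star)

def sample_proximal_gibbs(m, n, steps=200, eta=0.01):
    """Draw n approximate samples from p_mu via unadjusted Langevin dynamics."""
    x = rng.normal(size=n)
    for _ in range(steps):
        x = (x - eta * grad_first_variation(x, m) / lam
             + np.sqrt(2.0 * eta) * rng.normal(size=n))
    return x

# EFP outer loop: mu_{k+1} = (1 - alpha) * mu_k + alpha * p_{mu_k},
# realized on particles by resampling an alpha-fraction from the Gibbs measure.
particles = rng.normal(size=2000)
for _ in range(50):
    m = particles.mean()
    gibbs = sample_proximal_gibbs(m, particles.size)
    replace = rng.random(particles.size) < alpha
    particles = np.where(replace, gibbs, particles)

# For this toy functional, the fixed point has mean m_star / 2 and variance lam.
print(particles.mean(), particles.var())
```

In this toy case the fixed-point condition m = m_star − m gives a limiting mean of m_star / 2, so the particle statistics can be checked against a closed form; in the mean-field neural network setting the first variation would instead be estimated from the finite-sum loss.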

Cite

Text

Nitanda et al. "Primal and Dual Analysis of Entropic Fictitious Play for Finite-Sum Problems." International Conference on Machine Learning, 2023.

Markdown

[Nitanda et al. "Primal and Dual Analysis of Entropic Fictitious Play for Finite-Sum Problems." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/nitanda2023icml-primal/)

BibTeX

@inproceedings{nitanda2023icml-primal,
  title     = {{Primal and Dual Analysis of Entropic Fictitious Play for Finite-Sum Problems}},
  author    = {Nitanda, Atsushi and Oko, Kazusato and Wu, Denny and Takenouchi, Nobuhito and Suzuki, Taiji},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {26266--26282},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/nitanda2023icml-primal/}
}