C*-Algebra Net: A New Approach Generalizing Neural Network Parameters to C*-Algebra

Abstract

We propose a new framework that generalizes the parameters of neural network models to $C^*$-algebra-valued ones. A $C^*$-algebra is a generalization of the space of complex numbers; a typical example is the space of continuous functions on a compact space. This generalization enables us to combine multiple models continuously and to use tools for functions, such as regression and integration. Consequently, we can learn features of data efficiently and adapt the models to problems continuously. We apply our framework to practical problems such as density estimation and few-shot learning, and show that it enables us to learn features of data even with a limited number of samples. Our new framework highlights the potential of applying the theory of $C^*$-algebras to general neural network models.
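To make the central idea concrete, here is a minimal, hypothetical sketch (not the authors' implementation): taking the $C^*$-algebra to be $C([0,1])$, each scalar network parameter is replaced by a continuous function of $t \in [0,1]$, modeled below as a low-degree polynomial. Evaluating all parameters at a fixed $t$ yields an ordinary real-valued network, so varying $t$ sweeps continuously through a family of models. The class name `FunctionValuedDense` and the polynomial parameterization are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class FunctionValuedDense:
    """Affine layer whose weights are polynomials in t, i.e. elements of C([0,1]).

    Illustrative sketch only; the polynomial representation is one simple
    way to store a continuous-function-valued parameter.
    """

    def __init__(self, d_in, d_out, degree=2):
        # W[k] and b[k] hold the degree-k polynomial coefficients of each weight
        self.W = rng.normal(scale=0.1, size=(degree + 1, d_out, d_in))
        self.b = rng.normal(scale=0.1, size=(degree + 1, d_out))

    def at(self, t):
        """Evaluate the function-valued parameters at t, giving ordinary arrays."""
        powers = np.array([t**k for k in range(self.W.shape[0])])
        W_t = np.tensordot(powers, self.W, axes=1)  # shape (d_out, d_in)
        b_t = np.tensordot(powers, self.b, axes=1)  # shape (d_out,)
        return W_t, b_t

    def __call__(self, x, t):
        W_t, b_t = self.at(t)
        return np.tanh(W_t @ x + b_t)

layer = FunctionValuedDense(3, 2)
x = np.ones(3)
y0 = layer(x, 0.0)  # one model in the continuum
y1 = layer(x, 1.0)  # another, obtained without retraining
```

Because the parameters vary continuously in $t$, outputs at nearby values of $t$ are close, which is the sense in which multiple models are "combined continuously."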

Cite

Text

Hashimoto et al. "C*-Algebra Net: A New Approach Generalizing Neural Network Parameters to C*-Algebra." International Conference on Machine Learning, 2022.

Markdown

[Hashimoto et al. "C*-Algebra Net: A New Approach Generalizing Neural Network Parameters to C*-Algebra." International Conference on Machine Learning, 2022.](https://mlanthology.org/icml/2022/hashimoto2022icml-algebra/)

BibTeX

@inproceedings{hashimoto2022icml-algebra,
  title     = {{C*-Algebra Net: A New Approach Generalizing Neural Network Parameters to C*-Algebra}},
  author    = {Hashimoto, Yuka and Wang, Zhao and Matsui, Tomoko},
  booktitle = {International Conference on Machine Learning},
  year      = {2022},
  pages     = {8523--8534},
  volume    = {162},
  url       = {https://mlanthology.org/icml/2022/hashimoto2022icml-algebra/}
}