Complexity of Neural Network Training and ETR: Extensions with Effectively Continuous Functions

Abstract

The training problem for neural networks (NNs) is known to be ∃ℝ-complete (complete for the existential theory of the reals) with respect to ReLU and linear activation functions. We show that the training problem for NNs equipped with arbitrary activation functions is polynomial-time bireducible to the existential theory of the reals extended with the corresponding activation functions. For effectively continuous activation functions (e.g., the sigmoid function), we obtain membership in low levels of the arithmetical hierarchy. Consequently, the sigmoid activation function leads to the existential theory of the reals with the exponential function, and hence the decidability of training NNs using the sigmoid activation function is equivalent to the decidability of the existential theory of the reals with the exponential function, a long-standing open problem. In contrast, we show that the training problem is undecidable if sinusoidal activation functions are considered.
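To illustrate the flavor of the bireducibility result, here is a minimal sketch (not taken from the paper) of how a training instance can be phrased as an existential sentence: for a hypothetical single-neuron sigmoid network with weight $w$, bias $b$, data points $(x_1, y_1), \dots, (x_m, y_m)$, and loss threshold $\varepsilon$, asking whether the squared loss can be driven to at most $\varepsilon$ becomes

```latex
% Training a single sigmoid neuron to squared loss at most \varepsilon,
% expressed as an existential sentence over the reals with exp:
\exists w \, \exists b \;
  \sum_{i=1}^{m} \bigl( \sigma(w x_i + b) - y_i \bigr)^2 \le \varepsilon,
\qquad \text{where } \sigma(t) = \frac{1}{1 + e^{-t}}.
```

Since $\sigma$ is definable from the exponential function, deciding such sentences is exactly the existential theory of the reals extended with $\exp$, matching the equivalence stated in the abstract.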

Cite

Text

Hankala et al. "Complexity of Neural Network Training and ETR: Extensions with Effectively Continuous Functions." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I11.29118

Markdown

[Hankala et al. "Complexity of Neural Network Training and ETR: Extensions with Effectively Continuous Functions." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/hankala2024aaai-complexity/) doi:10.1609/AAAI.V38I11.29118

BibTeX

@inproceedings{hankala2024aaai-complexity,
  title     = {{Complexity of Neural Network Training and ETR: Extensions with Effectively Continuous Functions}},
  author    = {Hankala, Teemu and Hannula, Miika and Kontinen, Juha and Virtema, Jonni},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {12278--12285},
  doi       = {10.1609/AAAI.V38I11.29118},
  url       = {https://mlanthology.org/aaai/2024/hankala2024aaai-complexity/}
}