Kernel Logistic Regression Approximation of an Understandable ReLU Neural Network
Abstract
This paper proposes an understandable neural network whose score function is modeled as an additive sum of univariate spline functions. It extends common understandable models such as generalized additive models, spline-based models, and neural additive models. It is shown that this neural network can be approximated by a logistic regression whose inputs are obtained by a non-linear preprocessing of the input data. This preprocessing depends on the neural network initialization, but the paper establishes that it can be replaced by a non-random, kernel-based preprocessing that no longer depends on the initialization. Hence, convergence of the training process is guaranteed and the solution is unique for a given training dataset.
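As a rough illustration of the two objects the abstract contrasts, the sketch below fits (i) an additive ReLU network whose score is a sum of univariate piecewise-linear (spline-like) functions, one per input feature, and (ii) a logistic regression on a fixed kernel-based non-linear preprocessing. This is not the authors' implementation: the dataset, the Nystroem/RBF feature map, the number of hidden units per feature, and all other hyper-parameters are illustrative assumptions only.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.kernel_approximation import Nystroem
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=2000, n_features=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# (i) Additive ReLU network: score(x) = c + sum_j sum_k v[j,k] * relu(w[j,k] * x_j + b[j,k]).
# Each inner sum over k is a univariate piecewise-linear (linear-spline-like) function of feature j.
d, K = X.shape[1], 16                                  # K hidden units per feature (assumption)
w, b = rng.normal(size=(d, K)), rng.normal(size=(d, K))
v, c = 0.1 * rng.normal(size=(d, K)), 0.0

def forward(Z):
    pre = Z[:, :, None] * w + b                        # (n, d, K) univariate pre-activations
    h = np.maximum(pre, 0.0)                           # ReLU features, one group per input feature
    return pre, h, h.reshape(len(Z), -1) @ v.ravel() + c

for _ in range(3000):                                  # full-batch gradient descent on the logistic loss
    pre, h, z = forward(X_tr)
    g = (1.0 / (1.0 + np.exp(-z)) - y_tr) / len(y_tr)  # d(loss)/d(score) per sample
    mask = (pre > 0.0) * v                             # ReLU gate times current output weights
    v -= 0.2 * np.einsum("n,ndk->dk", g, h)
    w -= 0.2 * np.einsum("n,ndk,nd->dk", g, mask, X_tr)
    b -= 0.2 * np.einsum("n,ndk->dk", g, mask)
    c -= 0.2 * g.sum()

_, _, z_te = forward(X_te)
add_acc = float(((z_te > 0.0) == y_te).mean())

# (ii) Kernel logistic regression: a fixed non-linear preprocessing (here a Nystroem
# approximation of an RBF kernel map) followed by ordinary logistic regression.
klr = make_pipeline(Nystroem(kernel="rbf", n_components=200, random_state=0),
                    LogisticRegression(max_iter=2000))
klr.fit(X_tr, y_tr)

print("additive ReLU network accuracy :", round(add_acc, 3))
print("kernel logistic regression acc :", round(klr.score(X_te, y_te), 3))

Constraining every hidden unit to a single input feature is what keeps the learned score an additive sum of univariate functions, which is the readability property the abstract emphasizes; the kernel pipeline stands in for the fixed, initialization-independent preprocessing described there.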
Cite
Text
Guyomard et al. "Kernel Logistic Regression Approximation of an Understandable ReLU Neural Network." International Conference on Machine Learning, 2023.
Markdown
[Guyomard et al. "Kernel Logistic Regression Approximation of an Understandable ReLU Neural Network." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/guyomard2023icml-kernel/)
BibTeX
@inproceedings{guyomard2023icml-kernel,
  title     = {{Kernel Logistic Regression Approximation of an Understandable ReLU Neural Network}},
  author    = {Guyomard, Marie and Barbosa, Susana and Fillatre, Lionel},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {12268--12291},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/guyomard2023icml-kernel/}
}