Smooth Min-Max Monotonic Networks

Abstract

Monotonicity constraints are powerful regularizers in statistical modelling. They can support fairness in computer-aided decision making and increase plausibility in data-driven scientific models. The seminal min-max (MM) neural network architecture ensures monotonicity, but often gets stuck in undesired local optima during training because the partial derivatives of the min and max operations are zero for all but one of their arguments. We propose a simple modification of the MM network using strictly-increasing smooth minimum and maximum functions that alleviates this problem. The resulting smooth min-max (SMM) network module inherits the asymptotic approximation properties from the MM architecture. It can be used within larger deep learning systems trained end-to-end. The SMM module is conceptually simple and computationally less demanding than state-of-the-art neural networks for monotonic modelling. Our experiments show that this does not come with a loss in generalization performance compared to alternative neural and non-neural approaches.
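
To make the construction concrete, below is a minimal PyTorch sketch of such a module. It follows the classic min-of-maxes MM layout with positivity-constrained (exponentiated) weights and replaces the hard extrema by scaled LogSumExp operations, one strictly-increasing choice of smooth maximum and minimum. The fixed smoothness parameter beta, the exponentiated-weight parameterization, and all names are illustrative assumptions, not the authors' reference implementation.

import torch
import torch.nn as nn


class SmoothMinMax(nn.Module):
    """Illustrative smooth min-max monotonic module (a sketch, not the
    paper's reference code).

    Each of n_groups groups holds group_size linear units whose weights
    are kept positive via exp(), so every unit is increasing in the inputs.
    The hard max within a group and the hard min across groups of the
    classic MM architecture are replaced by scaled LogSumExp operations,
    one strictly-increasing choice of smooth max/min, so all linear units
    receive non-zero gradients during training.
    """

    def __init__(self, in_features: int, n_groups: int, group_size: int, beta: float = 2.0):
        super().__init__()
        self.log_w = nn.Parameter(0.1 * torch.randn(n_groups, group_size, in_features))
        self.bias = nn.Parameter(torch.zeros(n_groups, group_size))
        self.beta = beta  # smoothness: larger beta -> closer to the hard min/max

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = self.log_w.exp()                                # positive weights => monotone units
        z = torch.einsum('bi,gki->bgk', x, w) + self.bias   # (batch, n_groups, group_size)
        smooth_max = torch.logsumexp(self.beta * z, dim=2) / self.beta              # smooth max per group
        smooth_min = -torch.logsumexp(-self.beta * smooth_max, dim=1) / self.beta   # smooth min over groups
        return smooth_min                                    # increasing in every input


if __name__ == "__main__":
    # Tiny usage example: a scalar-output module that is increasing in all three inputs.
    torch.manual_seed(0)
    smm = SmoothMinMax(in_features=3, n_groups=4, group_size=4)
    x = torch.rand(8, 3)
    print(smm(x).shape)  # torch.Size([8])

Because every operation in the forward pass is differentiable with non-vanishing partial derivatives, the module can serve as a monotonic building block inside a larger network trained end-to-end by standard gradient descent.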

Cite

Text

Igel. "Smooth Min-Max Monotonic Networks." International Conference on Machine Learning, 2024.

Markdown

[Igel. "Smooth Min-Max Monotonic Networks." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/igel2024icml-smooth/)

BibTeX

@inproceedings{igel2024icml-smooth,
  title     = {{Smooth Min-Max Monotonic Networks}},
  author    = {Igel, Christian},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {20908--20923},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/igel2024icml-smooth/}
}