Neural Networks Perform Sufficient Dimension Reduction

Abstract

This paper investigates the connection between neural networks and sufficient dimension reduction (SDR), demonstrating that neural networks inherently perform SDR in regression tasks under appropriate rank regularization. Specifically, the weights in the first layer span the central mean subspace. We establish the statistical consistency of the neural network-based estimator for the central mean subspace, underscoring the suitability of neural networks for addressing SDR-related challenges. Numerical experiments further validate our theoretical findings and highlight the underlying capability of neural networks to facilitate SDR compared to existing methods. Additionally, we discuss an extension to unravel the central subspace, broadening the scope of our investigation.
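
A minimal sketch of the idea described above, not the authors' implementation: a feed-forward network whose first layer is a bias-free, rank-constrained linear map from the d predictors to k < d reduced directions, followed by a nonlinear head. After fitting the regression with squared-error loss, the row space of the first-layer weights serves as an estimate of the central mean subspace. All names (SDRNet, structural_dim, the simulated response) are hypothetical, and the structural dimension k is assumed known.

import torch
import torch.nn as nn

class SDRNet(nn.Module):
    def __init__(self, input_dim: int, structural_dim: int, hidden_dim: int = 64):
        super().__init__()
        # First layer: bias-free linear map B^T x whose rows span a k-dimensional subspace.
        self.reduction = nn.Linear(input_dim, structural_dim, bias=False)
        # Nonlinear head acting on the reduced predictors.
        self.head = nn.Sequential(
            nn.Linear(structural_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.reduction(x))

def estimated_central_mean_subspace(model: SDRNet) -> torch.Tensor:
    # Orthonormal basis (d x k) of the row space of the first-layer weights.
    q, _ = torch.linalg.qr(model.reduction.weight.T)
    return q

if __name__ == "__main__":
    # Toy regression whose mean function depends only on the first two coordinates,
    # so the central mean subspace is span{e1, e2}.
    n, d, k = 512, 10, 2
    x = torch.randn(n, d)
    y = x[:, 0].pow(2) + torch.sin(x[:, 1]) + 0.1 * torch.randn(n)
    model = SDRNet(d, k)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(500):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x).squeeze(-1), y)
        loss.backward()
        opt.step()
    print(estimated_central_mean_subspace(model))

In this sketch the estimated basis should concentrate its mass on the first two coordinates; agreement with the true subspace can be quantified, for instance, by the distance between the corresponding projection matrices.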

Cite

Text

Xu and Yu. "Neural Networks Perform Sufficient Dimension Reduction." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I20.35486

Markdown

[Xu and Yu. "Neural Networks Perform Sufficient Dimension Reduction." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/xu2025aaai-neural/) doi:10.1609/AAAI.V39I20.35486

BibTeX

@inproceedings{xu2025aaai-neural,
  title     = {{Neural Networks Perform Sufficient Dimension Reduction}},
  author    = {Xu, Shuntuo and Yu, Zhou},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {21806--21814},
  doi       = {10.1609/AAAI.V39I20.35486},
  url       = {https://mlanthology.org/aaai/2025/xu2025aaai-neural/}
}