Enhancing Sufficient Dimension Reduction via Hellinger Correlation

Abstract

In this work, we develop a new theory and method for sufficient dimension reduction (SDR) in single-index models, where SDR is a sub-field of supervised dimension reduction based on conditional independence. Our work is primarily motivated by the recent introduction of the Hellinger correlation as a dependence measure. Building on this measure, we develop a method that effectively detects the dimension reduction subspace, together with theoretical justification. Through extensive numerical experiments, we demonstrate that the proposed method substantially outperforms existing SDR methods. This improvement is largely attributed to the method's fuller account of data dependencies and the refinement it provides over existing SDR techniques.
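
The sketch below illustrates the general single-index idea described in the abstract: estimate the index direction by maximizing an empirical dependence measure between the projected predictor and the response. It is not the paper's procedure; distance correlation is used purely as a stand-in for the Hellinger correlation estimator, and the optimizer, toy model, and helper names are illustrative assumptions.

# Minimal sketch: single-index SDR by maximizing a dependence measure
# between the projection b^T X and the response Y.
# NOTE: distance correlation is an assumed stand-in here; the paper's
# method is based on the Hellinger correlation, not distance correlation.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import pdist, squareform

def distance_correlation(x, y):
    """Empirical (biased) distance correlation between 1-D samples."""
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    y = np.asarray(y, dtype=float).reshape(-1, 1)
    a = squareform(pdist(x))
    b = squareform(pdist(y))
    # Double-center the pairwise distance matrices.
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1)[:, None] + b.mean()
    dcov2 = (A * B).mean()
    denom = np.sqrt((A * A).mean() * (B * B).mean())
    return np.sqrt(dcov2 / denom) if denom > 0 else 0.0

def estimate_index(X, y, n_restarts=5, seed=0):
    """Estimate a single-index direction by maximizing the dependence
    between b^T X and y over unit vectors b (random restarts)."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]

    def neg_dep(b):
        b = b / np.linalg.norm(b)
        return -distance_correlation(X @ b, y)

    best, best_val = None, np.inf
    for _ in range(n_restarts):
        res = minimize(neg_dep, rng.standard_normal(p), method="Nelder-Mead")
        if res.fun < best_val:
            best, best_val = res.x, res.fun
    return best / np.linalg.norm(best)

# Toy single-index model: y depends on X only through beta^T X.
rng = np.random.default_rng(1)
n, p = 300, 5
beta = np.array([1.0, 2.0, 0.0, 0.0, 0.0])
beta /= np.linalg.norm(beta)
X = rng.standard_normal((n, p))
y = np.sin(2 * X @ beta) + 0.1 * rng.standard_normal(n)

b_hat = estimate_index(X, y)
# |cos angle| near 1 indicates the index direction is recovered.
print("alignment:", abs(b_hat @ beta))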

Cite

Text

Hong et al. "Enhancing Sufficient Dimension Reduction via Hellinger Correlation." International Conference on Machine Learning, 2024.

Markdown

[Hong et al. "Enhancing Sufficient Dimension Reduction via Hellinger Correlation." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/hong2024icml-enhancing/)

BibTeX

@inproceedings{hong2024icml-enhancing,
  title     = {{Enhancing Sufficient Dimension Reduction via Hellinger Correlation}},
  author    = {Hong, Seungbeom and Kim, Ilmun and Song, Jun},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {18634--18647},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/hong2024icml-enhancing/}
}