Computational Complexity of Kernel-Based Density-Ratio Estimation: A Condition Number Analysis

Abstract

In this study, the computational properties of a kernel-based least-squares density-ratio estimator are investigated from the viewpoint of condition numbers. The condition number of the Hessian matrix of the loss function is closely related to the convergence rate of optimization and to numerical stability. Using smoothed analysis techniques, we theoretically demonstrate that the kernel least-squares method has a smaller condition number than other M-estimators, which implies that the kernel least-squares method has desirable computational properties. In addition, an alternative formulation of the kernel least-squares estimator that possesses an even smaller condition number is presented. The validity of the theoretical analysis is verified through numerical experiments.
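To make the quantity under study concrete, the following sketch computes the condition number of the Hessian for a uLSIF-style kernel least-squares density-ratio estimator. The Gaussian kernel, bandwidth, regularizer, and synthetic data here are illustrative assumptions, not the paper's exact setup; the point is only that the least-squares loss yields a Hessian of the form Φᵀ Φ / n + λ I whose conditioning can be inspected directly.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kernel(X, C, sigma):
    # Gram matrix K[i, j] = exp(-||x_i - c_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# Samples from the denominator density q; for the least-squares
# loss, the Hessian depends only on these samples.
Xq = rng.normal(size=(200, 1))
centers = Xq[:50]                      # kernel centers (illustrative choice)
Phi = gaussian_kernel(Xq, centers, sigma=0.5)

lam = 1e-2                             # ridge regularizer (assumed value)
H = Phi.T @ Phi / len(Xq) + lam * np.eye(len(centers))

# The condition number of H governs the convergence rate of
# gradient-based optimization and the stability of solving the
# linear system for the kernel weights.
kappa = np.linalg.cond(H)
print(kappa)
```

Because H is symmetric positive definite, its condition number is the ratio of its largest to smallest eigenvalue, and increasing λ shrinks it toward 1, which is the trade-off between statistical and computational performance that the paper analyzes.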

Cite

Text

Kanamori et al. "Computational Complexity of Kernel-Based Density-Ratio Estimation: A Condition Number Analysis." Machine Learning, 2013. doi:10.1007/s10994-012-5323-6

Markdown

[Kanamori et al. "Computational Complexity of Kernel-Based Density-Ratio Estimation: A Condition Number Analysis." Machine Learning, 2013.](https://mlanthology.org/mlj/2013/kanamori2013mlj-computational/) doi:10.1007/s10994-012-5323-6

BibTeX

@article{kanamori2013mlj-computational,
  title     = {{Computational Complexity of Kernel-Based Density-Ratio Estimation: A Condition Number Analysis}},
  author    = {Kanamori, Takafumi and Suzuki, Taiji and Sugiyama, Masashi},
  journal   = {Machine Learning},
  year      = {2013},
  pages     = {431--460},
  doi       = {10.1007/s10994-012-5323-6},
  volume    = {90},
  url       = {https://mlanthology.org/mlj/2013/kanamori2013mlj-computational/}
}