On the Eigenvectors of P-Laplacian

Abstract

Spectral analysis approaches have been actively studied in machine learning and data mining areas, due to their generality, efficiency, and rich theoretical foundations. As a natural non-linear generalization of the graph Laplacian, the p-Laplacian has recently been proposed, which interpolates between a relaxation of the normalized cut and the Cheeger cut. However, the relaxation can only be applied to two-class cases. In this paper, we propose a full eigenvector analysis of the p-Laplacian and obtain a natural global embedding for multi-class clustering problems, instead of using the greedy search strategy implemented by previous researchers. An efficient gradient descent optimization approach is introduced to obtain the p-Laplacian embedding space, which is guaranteed to converge to feasible local solutions. Empirical results suggest that the greedy search method often fails in many real-world applications with non-trivial data structures, whereas our approach consistently obtains robust clustering results. Visualizations of experimental results also indicate that our embedding space preserves the smooth local manifold structures present in real-world data.
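For context, the p = 2 special case of the embedding described above reduces to standard spectral embedding with the normalized graph Laplacian, which admits a closed-form eigendecomposition (the paper's contribution is the non-linear p ≠ 2 generalization, which requires the gradient-based optimization instead). A minimal sketch of the p = 2 baseline, with an illustrative toy affinity matrix of our own choosing:

```python
import numpy as np

def spectral_embedding(W, k):
    """Embed graph nodes via the k smallest eigenvectors of the
    normalized Laplacian L = I - D^{-1/2} W D^{-1/2} (the p = 2 case)."""
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L = np.eye(len(W)) - D_inv_sqrt @ W @ D_inv_sqrt
    vals, vecs = np.linalg.eigh(L)      # eigenvalues in ascending order
    return vecs[:, :k]                  # one k-dimensional coordinate per node

# Toy graph: two tight clusters {0, 1} and {2, 3}, joined by weak cross edges.
W = np.array([[0.00, 1.00, 0.01, 0.00],
              [1.00, 0.00, 0.00, 0.01],
              [0.01, 0.00, 0.00, 1.00],
              [0.00, 0.01, 1.00, 0.00]])
Y = spectral_embedding(W, 2)
# The sign pattern of the second eigenvector separates the two clusters.
labels = (Y[:, 1] > 0).astype(int)
```

For p ≠ 2 there is no such eigendecomposition, so the paper's gradient descent over the embedding coordinates takes the place of `np.linalg.eigh` here.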

Cite

Text

Luo et al. "On the Eigenvectors of P-Laplacian." Machine Learning, 2010. doi:10.1007/s10994-010-5201-z

Markdown

[Luo et al. "On the Eigenvectors of P-Laplacian." Machine Learning, 2010.](https://mlanthology.org/mlj/2010/luo2010mlj-eigenvectors/) doi:10.1007/s10994-010-5201-z

BibTeX

@article{luo2010mlj-eigenvectors,
  title     = {{On the Eigenvectors of P-Laplacian}},
  author    = {Luo, Dijun and Huang, Heng and Ding, Chris H. Q. and Nie, Feiping},
  journal   = {Machine Learning},
  year      = {2010},
  pages     = {37--51},
  doi       = {10.1007/s10994-010-5201-z},
  volume    = {81},
  url       = {https://mlanthology.org/mlj/2010/luo2010mlj-eigenvectors/}
}