Multi-View Oriented GPLVM: Expressiveness and Efficiency

Abstract

The multi-view Gaussian process latent variable model (MV-GPLVM) aims to learn a unified representation from multi-view data but is hindered by limited kernel expressiveness and low computational efficiency. To overcome these issues, we first introduce a new duality between the spectral density and the kernel function. By modeling the spectral density with a bivariate Gaussian mixture, we then derive a generic and expressive kernel termed Next-Gen Spectral Mixture (NG-SM) for MV-GPLVMs. To address the inherent computational inefficiency of the NG-SM kernel, we propose a random Fourier feature approximation. Combined with a tailored reparameterization trick, this approximation enables scalable variational inference for both the model and the unified latent representations. Numerical evaluations across a diverse range of multi-view datasets demonstrate that our proposed method consistently outperforms state-of-the-art models in learning meaningful latent representations.
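The abstract builds on the classical duality (Bochner's theorem) between a stationary kernel and its spectral density, and on random Fourier features (RFF) for cheap kernel approximation. The sketch below illustrates that standard machinery only — a one-dimensional spectral mixture kernel whose spectral density is a Gaussian mixture, approximated by sampling frequencies from that density. It is not the paper's NG-SM kernel; the mixture parameters (`weights`, `means`, `stds`) are hypothetical values chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical Gaussian-mixture spectral density over frequencies:
# S(s) = sum_q w_q * N(s; mu_q, sigma_q^2), with weights summing to 1.
weights = np.array([0.6, 0.4])
means = np.array([0.5, 2.0])
stds = np.array([0.3, 0.1])


def sm_kernel(x1, x2):
    """Exact spectral mixture kernel implied by the density above:
    k(tau) = sum_q w_q * exp(-2 pi^2 sigma_q^2 tau^2) * cos(2 pi mu_q tau)."""
    tau = x1[:, None] - x2[None, :]
    k = np.zeros_like(tau)
    for w, mu, s in zip(weights, means, stds):
        k += w * np.exp(-2 * np.pi**2 * s**2 * tau**2) * np.cos(2 * np.pi * mu * tau)
    return k


def rff_features(x, num_feats=5000):
    """Random Fourier features: sample frequencies from the spectral density
    (mixture component first, then a Gaussian draw), plus a uniform phase."""
    comps = rng.choice(len(weights), size=num_feats, p=weights)
    omega = rng.normal(means[comps], stds[comps])
    b = rng.uniform(0.0, 2 * np.pi, size=num_feats)
    return np.sqrt(2.0 / num_feats) * np.cos(2 * np.pi * np.outer(x, omega) + b)


x = np.linspace(0.0, 1.0, 50)
K_exact = sm_kernel(x, x)        # O(N^2) exact Gram matrix
Phi = rff_features(x)            # O(N * D) feature map
K_rff = Phi @ Phi.T              # low-rank Monte Carlo approximation
print(np.max(np.abs(K_exact - K_rff)))  # approximation error shrinks as D grows
```

The feature map makes downstream Gaussian process computations scale linearly in the number of data points, which is the efficiency argument the abstract invokes for its (more expressive, bivariate) variant.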

Cite

Text

Yang et al. "Multi-View Oriented GPLVM: Expressiveness and Efficiency." Advances in Neural Information Processing Systems, 2025.

Markdown

[Yang et al. "Multi-View Oriented GPLVM: Expressiveness and Efficiency." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/yang2025neurips-multiview/)

BibTeX

@inproceedings{yang2025neurips-multiview,
  title     = {{Multi-View Oriented GPLVM: Expressiveness and Efficiency}},
  author    = {Yang, Zi and Li, Ying and Lin, Zhidi and Zhang, Michael Minyi and Olmos, Pablo M.},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/yang2025neurips-multiview/}
}