Bayesian Maximum Margin Principal Component Analysis

Abstract

Supervised dimensionality reduction has shown great advantages in finding predictive subspaces. Previous methods rarely consider the popular maximum margin principle and are prone to overfitting when training data are scarce, especially methods developed under the maximum likelihood framework. In this paper, we present a posterior-regularized Bayesian approach that combines Principal Component Analysis (PCA) with max-margin learning. Based on the data augmentation idea for max-margin learning and the probabilistic interpretation of PCA, our method automatically infers the weight and penalty parameters of the max-margin learning machine while simultaneously finding the most appropriate PCA subspace under the Bayesian framework. We develop a fast mean-field variational inference algorithm to approximate the posterior. Experimental results on various classification tasks show that our method outperforms a number of competitors.
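To give a concrete feel for one building block the abstract names, the probabilistic interpretation of PCA, here is a minimal sketch of the closed-form maximum-likelihood fit of probabilistic PCA (Tipping and Bishop's solution). This is only an illustration of the PPCA component, not the paper's full Bayesian max-margin model or its variational inference; the function name and toy data are our own.

```python
import numpy as np

def ppca_ml(X, q):
    """Closed-form maximum-likelihood probabilistic PCA.

    X : (n, d) data matrix; q : latent dimension.
    Returns the loading matrix W (d, q) and noise variance sigma^2.
    """
    n, d = X.shape
    Xc = X - X.mean(axis=0)                     # center the data
    S = Xc.T @ Xc / n                           # sample covariance (d x d)
    evals, evecs = np.linalg.eigh(S)            # ascending eigenvalues
    evals, evecs = evals[::-1], evecs[:, ::-1]  # sort descending
    sigma2 = evals[q:].mean()                   # noise = avg. discarded variance
    # ML loadings: top-q eigenvectors scaled by sqrt(eigenvalue - noise)
    W = evecs[:, :q] * np.sqrt(np.maximum(evals[:q] - sigma2, 0.0))
    return W, sigma2

# Toy data: 3-D points generated from a 1-D latent factor plus noise.
rng = np.random.default_rng(0)
z = rng.normal(size=(500, 1))
X = z @ np.array([[2.0, 1.0, 0.0]]) + 0.1 * rng.normal(size=(500, 3))
W, sigma2 = ppca_ml(X, q=1)  # W's column aligns with the true direction
```

The paper's contribution builds on this generative view by placing a full Bayesian treatment over the subspace and coupling it with a max-margin classifier via data augmentation, rather than using the point-estimate fit sketched here.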

Cite

Text

Du et al. "Bayesian Maximum Margin Principal Component Analysis." AAAI Conference on Artificial Intelligence, 2015. doi:10.1609/AAAI.V29I1.9583

Markdown

[Du et al. "Bayesian Maximum Margin Principal Component Analysis." AAAI Conference on Artificial Intelligence, 2015.](https://mlanthology.org/aaai/2015/du2015aaai-bayesian/) doi:10.1609/AAAI.V29I1.9583

BibTeX

@inproceedings{du2015aaai-bayesian,
  title     = {{Bayesian Maximum Margin Principal Component Analysis}},
  author    = {Du, Changying and Zhe, Shandian and Zhuang, Fuzhen and Qi, Yuan and He, Qing and Shi, Zhongzhi},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2015},
  pages     = {2582--2588},
  doi       = {10.1609/AAAI.V29I1.9583},
  url       = {https://mlanthology.org/aaai/2015/du2015aaai-bayesian/}
}