Mixtures of Local Linear Subspaces for Face Recognition
Abstract
Traditional subspace methods for face recognition compute a measure of similarity between images after projecting them onto a fixed linear subspace that is spanned by some principal component vectors (a.k.a. "eigenfaces") of a training set of images. By supposing a parametric Gaussian distribution over the subspace and a symmetric Gaussian noise model for the image given a point in the subspace, we can endow this framework with a probabilistic interpretation so that Bayes-optimal decisions can be made. However, we expect that different image clusters (corresponding, say, to different poses and expressions) will be best represented by different subspaces. In this paper, we study the recognition performance of a mixture of local linear subspaces model that can be fit to training data using the expectation maximization algorithm. The mixture model outperforms a nearest-neighbor classifier that operates in a PCA subspace.
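The model described in the abstract clusters images and fits a linear subspace per cluster with EM. As a rough illustrative sketch only (not the authors' exact algorithm, which builds the low-rank subspace directly into each mixture component's covariance), one can fit a spherical Gaussian mixture with EM and then extract a local PCA basis from each component's responsibility-weighted covariance; all function names and settings below are this sketch's own assumptions:

```python
import numpy as np

def em_mixture(X, k, n_iter=50, seed=0):
    """EM for a k-component spherical Gaussian mixture (simplified sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X[rng.choice(n, k, replace=False)]   # init means from random points
    var = np.full(k, X.var())                 # one spherical variance per component
    pi = np.full(k, 1.0 / k)                  # mixing weights
    for _ in range(n_iter):
        # E-step: responsibilities under each spherical Gaussian
        sq = ((X[:, None, :] - mu[None]) ** 2).sum(-1)              # (n, k)
        log_p = -0.5 * (sq / var + d * np.log(2 * np.pi * var)) + np.log(pi)
        log_p -= log_p.max(1, keepdims=True)                        # for stability
        r = np.exp(log_p)
        r /= r.sum(1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        nk = r.sum(0)
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        sq = ((X[:, None, :] - mu[None]) ** 2).sum(-1)
        var = (r * sq).sum(0) / (nk * d)
    return pi, mu, var, r

def local_subspaces(X, r, q):
    """Local linear subspace per component: top-q eigenvectors of the
    responsibility-weighted covariance (a per-cluster PCA)."""
    subspaces = []
    for k in range(r.shape[1]):
        w = r[:, k:k + 1]
        mu = (w * X).sum(0) / w.sum()
        C = ((w * (X - mu)).T @ (X - mu)) / w.sum()
        vals, vecs = np.linalg.eigh(C)        # eigenvalues in ascending order
        subspaces.append(vecs[:, -q:])        # keep the top-q directions
    return subspaces
```

In this sketch the clustering and the subspace extraction are two separate stages; the appeal of the mixture model in the paper is that both are estimated jointly inside a single EM loop, so the subspaces themselves shape the soft cluster assignments.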
Cite
Text
Frey et al. "Mixtures of Local Linear Subspaces for Face Recognition." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 1998. doi:10.1109/CVPR.1998.698584
Markdown
[Frey et al. "Mixtures of Local Linear Subspaces for Face Recognition." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 1998.](https://mlanthology.org/cvpr/1998/frey1998cvpr-mixtures/) doi:10.1109/CVPR.1998.698584
BibTeX
@inproceedings{frey1998cvpr-mixtures,
title = {{Mixtures of Local Linear Subspaces for Face Recognition}},
author = {Frey, Brendan J. and Colmenarez, Antonio and Huang, Thomas S.},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
year = {1998},
pages = {32-37},
doi = {10.1109/CVPR.1998.698584},
url = {https://mlanthology.org/cvpr/1998/frey1998cvpr-mixtures/}
}