Max-Margin Multiple-Instance Dictionary Learning

Abstract

Dictionary learning has become an increasingly important task in machine learning, as it is fundamental to the representation problem. A number of emerging techniques specifically include a codebook learning step, in which a critical knowledge abstraction process is carried out. Existing approaches to dictionary (codebook) learning are either generative (unsupervised, e.g. k-means) or discriminative (supervised, e.g. extremely randomized forests). In this paper, we propose a multiple instance learning (MIL) strategy (in the spirit of weakly supervised learning) for dictionary learning. Each code is represented by a classifier, such as a linear SVM, which naturally performs metric fusion for multi-channel features. We design a formulation to simultaneously learn mixtures of codes by maximizing classification margins in MIL. State-of-the-art results are observed on image classification benchmarks based on the learned codebooks, which exhibit both compactness and effectiveness.
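The core idea in the abstract, representing a code as a max-margin linear classifier trained under the MIL assumption (a positive bag contains at least one positive instance, a negative bag contains none), can be illustrated with a small toy sketch. This is an assumption-laden simplification, not the paper's full mixture formulation: it learns a single code by alternating witness selection in positive bags with subgradient steps on an L2-regularized hinge loss, using plain NumPy.

```python
import numpy as np

def train_mil_code(bags, labels, dim, epochs=200, lr=0.1, lam=0.01, seed=0):
    """Learn one 'code' as a linear max-margin classifier under the standard
    MIL assumption. Toy sketch only -- not the authors' full method, which
    learns mixtures of codes jointly."""
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.01, size=dim)
    b = 0.0
    for _ in range(epochs):
        X, y = [], []
        for bag, lab in zip(bags, labels):
            if lab == 1:
                # Witness selection: the highest-scoring instance in a
                # positive bag is treated as the positive example.
                scores = bag @ w + b
                X.append(bag[int(np.argmax(scores))])
                y.append(1.0)
            else:
                # Every instance in a negative bag is a negative example.
                X.extend(bag)
                y.extend([-1.0] * len(bag))
        X, y = np.asarray(X), np.asarray(y)
        # One subgradient step on the L2-regularized hinge loss.
        margins = y * (X @ w + b)
        mask = margins < 1.0
        grad_w = lam * w - (y[mask, None] * X[mask]).sum(axis=0) / len(y)
        grad_b = -y[mask].sum() / len(y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy data: each positive bag hides one "witness" instance near (2, 2);
# all other instances (and all negative bags) are noise near the origin.
rng = np.random.default_rng(1)
bags, labels = [], []
for i in range(40):
    noise = rng.normal(scale=0.3, size=(4, 2))
    if i % 2 == 0:
        bag = noise.copy()
        bag[0] += np.array([2.0, 2.0])
        bags.append(bag); labels.append(1)
    else:
        bags.append(noise); labels.append(-1)

w, b = train_mil_code(bags, labels, dim=2)
# Bag-level prediction: the max instance score decides the bag label.
preds = [1 if (bag @ w + b).max() > 0 else -1 for bag in bags]
acc = np.mean([p == l for p, l in zip(preds, labels)])
```

On this separable toy problem, the learned code recovers the witness direction and classifies the bags correctly; in the paper this single-code scheme is extended to mixtures of codes trained jointly under the max-margin MIL objective.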

Cite

Text

Wang et al. "Max-Margin Multiple-Instance Dictionary Learning." International Conference on Machine Learning, 2013.

Markdown

[Wang et al. "Max-Margin Multiple-Instance Dictionary Learning." International Conference on Machine Learning, 2013.](https://mlanthology.org/icml/2013/wang2013icml-maxmargin/)

BibTeX

@inproceedings{wang2013icml-maxmargin,
  title     = {{Max-Margin Multiple-Instance Dictionary Learning}},
  author    = {Wang, Xinggang and Wang, Baoyuan and Bai, Xiang and Liu, Wenyu and Tu, Zhuowen},
  booktitle = {International Conference on Machine Learning},
  year      = {2013},
  pages     = {846--854},
  volume    = {28},
  url       = {https://mlanthology.org/icml/2013/wang2013icml-maxmargin/}
}