Local Context Sparse Coding

Abstract

The n-gram model has been widely used to capture the local ordering of words, yet its exploding feature space often causes estimation issues. This paper presents local context sparse coding (LCSC), a non-probabilistic topic model that effectively handles large feature spaces using sparse coding. In addition, it introduces a new notion of locality, local contexts, yielding a representation that can generate locally coherent topics and document representations. Our model finds topics and representations efficiently through greedy coordinate descent updates. The model is useful for discovering local topics and the semantic flow of a document, as well as for constructing predictive models.
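The abstract mentions greedy coordinate descent for the sparse coding step. As a rough illustration only (the paper's actual objective and update rules are not reproduced here), the sketch below applies greedy coordinate descent with soft-thresholding to a standard lasso-style sparse coding problem; the function names, the unit-norm dictionary assumption, and the regularization weight `lam` are all illustrative.

```python
import numpy as np

def soft_threshold(z, lam):
    # Shrinkage operator: the closed-form solution of the 1-D lasso subproblem.
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def greedy_cd_sparse_code(D, x, lam=0.1, iters=100, tol=1e-8):
    """Sparse code x over dictionary D by greedy coordinate descent.

    Minimizes 0.5 * ||x - D a||^2 + lam * ||a||_1, updating at each step
    the single coordinate whose optimal change is largest (the greedy rule).
    D: (m, k) dictionary with (assumed) unit-norm columns; x: (m,) signal.
    """
    k = D.shape[1]
    a = np.zeros(k)
    G = D.T @ D            # Gram matrix of the dictionary
    b = D.T @ x            # correlations of the signal with each atom
    diag = np.diag(G)
    for _ in range(iters):
        # Closed-form coordinate-wise minimizers for all coordinates at once.
        z = b - G @ a + diag * a
        a_new = soft_threshold(z, lam) / diag
        j = np.argmax(np.abs(a_new - a))   # greedy coordinate choice
        if np.abs(a_new[j] - a[j]) < tol:
            break
        a[j] = a_new[j]
    return a
```

With an orthonormal dictionary this reduces to plain soft-thresholding of the correlations, so small weights are driven exactly to zero, which is what produces the sparse topic activations the abstract refers to.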

Cite

Text

Kim et al. "Local Context Sparse Coding." AAAI Conference on Artificial Intelligence, 2015. doi:10.1609/aaai.v29i1.9518

Markdown

[Kim et al. "Local Context Sparse Coding." AAAI Conference on Artificial Intelligence, 2015.](https://mlanthology.org/aaai/2015/kim2015aaai-local/) doi:10.1609/aaai.v29i1.9518

BibTeX

@inproceedings{kim2015aaai-local,
  title     = {{Local Context Sparse Coding}},
  author    = {Kim, Seungyeon and Lee, Joonseok and Lebanon, Guy and Park, Haesun},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2015},
  pages     = {2260--2266},
  doi       = {10.1609/aaai.v29i1.9518},
  url       = {https://mlanthology.org/aaai/2015/kim2015aaai-local/}
}