EntropyRank: Unsupervised Keyphrase Extraction via Side-Information Optimization for Language Model-Based Text Compression

Abstract

We propose an unsupervised method to extract keywords and keyphrases from texts based on a pre-trained language model (LM) and Shannon's information maximization. Specifically, our method extracts phrases having the highest conditional entropy under the LM. The resulting set of keyphrases turns out to solve a relevant information-theoretic problem: if provided as side information, it leads to the minimal expected binary code length in compressing the text using the LM and an entropy encoder. Alternatively, the resulting set is an approximation, via a causal LM, to the set of phrases that minimize the entropy of the text when conditioned upon it. Empirically, the method provides results comparable to the most commonly used methods on various keyphrase extraction benchmarks.
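The ranking idea in the abstract can be sketched in a few lines: score each candidate phrase by the summed Shannon entropy of the model's next-token distributions over the phrase's positions, then rank from high to low. The sketch below substitutes a toy bigram count model for the pretrained causal LM, and all function names (`bigram_model`, `next_token_entropy`, `entropy_rank`) are illustrative, not from the paper's code.

```python
import math
from collections import Counter, defaultdict

def bigram_model(tokens):
    """Estimate next-token distributions with bigram counts
    (a toy stand-in for the pretrained causal LM used in the paper)."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def next_token_entropy(counts, prev):
    """Shannon entropy (bits) of the model's next-token distribution after `prev`."""
    dist = counts.get(prev)
    if not dist:
        return 0.0
    total = sum(dist.values())
    return -sum((c / total) * math.log2(c / total) for c in dist.values())

def entropy_rank(text, phrase_len=2):
    """Score each candidate phrase by the summed conditional entropy of its
    tokens given their left context; rank phrases from high to low entropy."""
    tokens = text.lower().split()
    counts = bigram_model(tokens)
    scores = {}
    # Start at i=1 so every phrase token has a left-context token.
    for i in range(1, len(tokens) - phrase_len + 1):
        phrase = tuple(tokens[i:i + phrase_len])
        score = sum(next_token_entropy(counts, tokens[i + j - 1])
                    for j in range(phrase_len))
        scores[phrase] = max(scores.get(phrase, 0.0), score)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

With a real LM, `next_token_entropy` would instead be computed from the softmax over the model's logits at each position; the ranking logic is unchanged.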

Cite

Text

Tsvetkov and Kipnis. "EntropyRank: Unsupervised Keyphrase Extraction via Side-Information Optimization for Language Model-Based Text Compression." ICML 2023 Workshops: NCW, 2023.

Markdown

[Tsvetkov and Kipnis. "EntropyRank: Unsupervised Keyphrase Extraction via Side-Information Optimization for Language Model-Based Text Compression." ICML 2023 Workshops: NCW, 2023.](https://mlanthology.org/icmlw/2023/tsvetkov2023icmlw-entropyrank/)

BibTeX

@inproceedings{tsvetkov2023icmlw-entropyrank,
  title     = {{EntropyRank: Unsupervised Keyphrase Extraction via Side-Information Optimization for Language Model-Based Text Compression}},
  author    = {Tsvetkov, Alexander and Kipnis, Alon},
  booktitle = {ICML 2023 Workshops: NCW},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/tsvetkov2023icmlw-entropyrank/}
}