TOCOL: Improving Contextual Representation of Pre-Trained Language Models via Token-Level Contrastive Learning

Cite

Text

Wang et al. "TOCOL: Improving Contextual Representation of Pre-Trained Language Models via Token-Level Contrastive Learning." Machine Learning, 2024. doi:10.1007/s10994-023-06512-9

Markdown

[Wang et al. "TOCOL: Improving Contextual Representation of Pre-Trained Language Models via Token-Level Contrastive Learning." Machine Learning, 2024.](https://mlanthology.org/mlj/2024/wang2024mlj-tocol/) doi:10.1007/s10994-023-06512-9

BibTeX

@article{wang2024mlj-tocol,
  title     = {{TOCOL: Improving Contextual Representation of Pre-Trained Language Models via Token-Level Contrastive Learning}},
  author    = {Wang, Keheng and Yin, Chuantao and Li, Rumei and Wang, Sirui and Xian, Yunsen and Rong, Wenge and Xiong, Zhang},
  journal   = {Machine Learning},
  year      = {2024},
  pages     = {3999--4012},
  doi       = {10.1007/s10994-023-06512-9},
  volume    = {113},
  url       = {https://mlanthology.org/mlj/2024/wang2024mlj-tocol/}
}