Multi-Level Optimal Transport for Universal Cross-Tokenizer Knowledge Distillation on Language Models

Cite

Text

Cui et al. "Multi-Level Optimal Transport for Universal Cross-Tokenizer Knowledge Distillation on Language Models." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I22.34543

Markdown

[Cui et al. "Multi-Level Optimal Transport for Universal Cross-Tokenizer Knowledge Distillation on Language Models." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/cui2025aaai-multi/) doi:10.1609/AAAI.V39I22.34543

BibTeX

@inproceedings{cui2025aaai-multi,
  title     = {{Multi-Level Optimal Transport for Universal Cross-Tokenizer Knowledge Distillation on Language Models}},
  author    = {Cui, Xiao and Zhu, Mo and Qin, Yulei and Xie, Liang and Zhou, Wengang and Li, Houqiang},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {23724--23732},
  doi       = {10.1609/AAAI.V39I22.34543},
  url       = {https://mlanthology.org/aaai/2025/cui2025aaai-multi/}
}