Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data

Cite

Text

Xu et al. "Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data." International Joint Conference on Artificial Intelligence, 2023. doi:10.24963/IJCAI.2023/496

Markdown

[Xu et al. "Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data." International Joint Conference on Artificial Intelligence, 2023.](https://mlanthology.org/ijcai/2023/xu2023ijcai-distilling/) doi:10.24963/IJCAI.2023/496

BibTeX

@inproceedings{xu2023ijcai-distilling,
  title     = {{Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data}},
  author    = {Xu, Qing and Wu, Min and Li, Xiaoli and Mao, Kezhi and Chen, Zhenghua},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2023},
  pages     = {4460--4468},
  doi       = {10.24963/IJCAI.2023/496},
  url       = {https://mlanthology.org/ijcai/2023/xu2023ijcai-distilling/}
}