Multi-Stage Balanced Distillation: Addressing Long-Tail Challenges in Sequence-Level Knowledge Distillation

Year
2024
Type(s)
Conference paper
Author(s)
Zhou, Yuhang, Jing Zhu, Paiheng Xu, Xiaoyu Liu, Xiyao Wang, Danai Koutra, Wei Ai, and Furong Huang.
Source
The 2024 Conference on Empirical Methods in Natural Language Processing (EMNLP 2024).
Url
https://arxiv.org/abs/2406.13114
BibTeX
@inproceedings{zhou2024multistage,
  title     = {Multi-Stage Balanced Distillation: Addressing Long-Tail Challenges in Sequence-Level Knowledge Distillation},
  author    = {Zhou, Yuhang and Zhu, Jing and Xu, Paiheng and Liu, Xiaoyu and Wang, Xiyao and Koutra, Danai and Ai, Wei and Huang, Furong},
  booktitle = {The 2024 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
  year      = {2024},
  url       = {https://arxiv.org/abs/2406.13114}
}