運用響應式知識蒸餾機制增進中文多標籤文本分類效能

Translated title of the contribution: Enhancing Chinese Multi-Label Text Classification Performance with Response-based Knowledge Distillation

Szu Chi Huang, Cheng Fu Cao, Po Hsun Liao, Lung Hao Lee, Po Lei Lee, Kuo Kai Shyu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

It is difficult to optimize individual label performance in multi-label text classification, especially for imbalanced data containing long-tailed labels. This study therefore proposes a response-based knowledge distillation mechanism comprising a teacher model, which optimizes a binary classifier for each label, and a student model, a standalone multi-label classifier that learns from the distilled knowledge passed on by the teacher model. A total of 2,724 Chinese healthcare texts were collected and manually annotated across nine defined labels, resulting in 8,731 label annotations, an average of 3.2 labels per text. We used 5-fold cross-validation to compare the performance of several multi-label models, including TextRNN, TextCNN, HAN, and GRU-att. Experimental results indicate that the proposed knowledge distillation mechanism effectively improved performance regardless of the model used, with gains of about 2-3% in micro-F1, 4-6% in macro-F1, 3-4% in weighted-F1, and 1-2% in subset accuracy.
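The response-based mechanism described above can be sketched as a distillation loss: the teacher's per-label binary classifiers produce soft probability targets, and the student's multi-label classifier is trained against both the gold labels and those soft responses. This is a minimal illustrative sketch, not the authors' implementation; the function name, the weighting factor `alpha`, and the temperature `T` are assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      alpha=0.5, T=2.0):
    """Combine hard-label BCE with a per-label soft-target term.

    student_logits / teacher_logits: (batch, num_labels) raw scores.
    targets: (batch, num_labels) gold 0/1 labels.
    """
    # Hard loss: standard multi-label BCE against the gold labels.
    hard = F.binary_cross_entropy_with_logits(student_logits, targets)
    # Soft loss: match the teacher's temperature-scaled sigmoid
    # responses for each label; T*T rescales gradients as in
    # standard response-based distillation.
    t_soft = torch.sigmoid(teacher_logits / T)
    s_soft = torch.sigmoid(student_logits / T)
    soft = F.binary_cross_entropy(s_soft, t_soft) * (T * T)
    return alpha * hard + (1.0 - alpha) * soft

# Example: a batch of 4 texts with 9 labels, as in the dataset above.
student_logits = torch.randn(4, 9)
teacher_logits = torch.randn(4, 9)
targets = torch.randint(0, 2, (4, 9)).float()
loss = distillation_loss(student_logits, teacher_logits, targets)
```

At `alpha=1.0` the student ignores the teacher and reduces to a plain multi-label classifier, which makes the contribution of the distilled soft targets easy to ablate.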

Original language: Chinese (Traditional)
Title of host publication: ROCLING 2022 - Proceedings of the 34th Conference on Computational Linguistics and Speech Processing
Editors: Yung-Chun Chang, Yi-Chin Huang, Jheng-Long Wu, Ming-Hsiang Su, Hen-Hsen Huang, Yi-Fen Liu, Lung-Hao Lee, Chin-Hung Chou, Yuan-Fu Liao
Publisher: The Association for Computational Linguistics and Chinese Language Processing (ACLCLP)
Pages: 25-31
Number of pages: 7
ISBN (Electronic): 9789869576956
State: Published - 2022
Event: 34th Conference on Computational Linguistics and Speech Processing, ROCLING 2022 - Taipei, Taiwan
Duration: 21 Nov 2022 - 22 Nov 2022

Publication series

Name: ROCLING 2022 - Proceedings of the 34th Conference on Computational Linguistics and Speech Processing

Conference

Conference: 34th Conference on Computational Linguistics and Speech Processing, ROCLING 2022
Country/Territory: Taiwan
City: Taipei
Period: 21/11/22 - 22/11/22
