Abstract
In this paper, we propose a knowledge infusion mechanism to incorporate domain knowledge into language transformers. Weakly supervised data is regarded as the main source for knowledge acquisition. We pre-train the language models to capture masked knowledge of focuses and aspects, and then fine-tune them to obtain better performance on the downstream tasks. Due to the lack of publicly available datasets for multi-label classification of Chinese medical questions, we crawled questions from medical question/answer forums and manually annotated them using eight predefined classes: persons and organizations, symptom, cause, examination, disease, information, ingredient, and treatment. In total, we obtained 1,814 questions with 2,340 labels; each question contains an average of 1.29 labels. We used Baidu Medical Encyclopedia as the knowledge resource. Two transformers, BERT and RoBERTa, were implemented to compare performance on our constructed dataset. Experimental results showed that our proposed model with the knowledge infusion mechanism achieves better performance on every evaluation metric considered: Macro F1, Micro F1, Weighted F1, and Subset Accuracy.
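The four multi-label evaluation metrics named in the abstract can be computed from binary label-indicator vectors. The following is a minimal, self-contained sketch of those computations; it is not the paper's evaluation code, and the example label set merely mirrors the eight classes listed above.

```python
# Sketch: Macro F1, Micro F1, Weighted F1, and Subset Accuracy for
# multi-label classification, over binary label-indicator vectors
# (one 0/1 entry per class, e.g. the paper's eight medical classes).

def _counts(y_true, y_pred, j):
    """True positives, false positives, false negatives for label j."""
    tp = sum(t[j] and p[j] for t, p in zip(y_true, y_pred))
    fp = sum((not t[j]) and p[j] for t, p in zip(y_true, y_pred))
    fn = sum(t[j] and (not p[j]) for t, p in zip(y_true, y_pred))
    return tp, fp, fn

def _f1(tp, fp, fn):
    """F1 from counts; defined as 0 when there are no positives at all."""
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom else 0.0

def multilabel_scores(y_true, y_pred):
    n_labels = len(y_true[0])
    counts = [_counts(y_true, y_pred, j) for j in range(n_labels)]
    f1s = [_f1(*c) for c in counts]                     # per-label F1
    support = [sum(t[j] for t in y_true) for j in range(n_labels)]

    # Macro F1: unweighted mean of per-label F1 scores.
    macro = sum(f1s) / n_labels
    # Micro F1: F1 over counts pooled across all labels.
    tp, fp, fn = (sum(c[i] for c in counts) for i in range(3))
    micro = _f1(tp, fp, fn)
    # Weighted F1: per-label F1 weighted by label support.
    total = sum(support)
    weighted = sum(f * s for f, s in zip(f1s, support)) / total if total else 0.0
    # Subset Accuracy: fraction of samples whose full label set is exact.
    subset = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

    return {"macro_f1": macro, "micro_f1": micro,
            "weighted_f1": weighted, "subset_accuracy": subset}
```

These definitions match the `average='macro'|'micro'|'weighted'` conventions of common toolkits such as scikit-learn, where subset accuracy is the multi-label meaning of `accuracy_score`.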
Original language | English |
---|---|
Title of host publication | ROCLING 2021 - Proceedings of the 33rd Conference on Computational Linguistics and Speech Processing |
Editors | Lung-Hao Lee, Chia-Hui Chang, Kuan-Yu Chen |
Publisher | The Association for Computational Linguistics and Chinese Language Processing (ACLCLP) |
Pages | 265-270 |
Number of pages | 6 |
ISBN (Electronic) | 9789869576949 |
State | Published - 2021 |
Event | 33rd Conference on Computational Linguistics and Speech Processing, ROCLING 2021 - Taoyuan, Taiwan Duration: 15 Oct 2021 → 16 Oct 2021 |
Publication series
Name | ROCLING 2021 - Proceedings of the 33rd Conference on Computational Linguistics and Speech Processing |
---|
Conference
Conference | 33rd Conference on Computational Linguistics and Speech Processing, ROCLING 2021 |
---|---|
Country/Territory | Taiwan |
City | Taoyuan |
Period | 15/10/21 → 16/10/21 |
Keywords
- Biomedical informatics
- Domain knowledge extraction
- Pretrained language models
- Text classification
Fingerprint
Dive into the research topics of 'Incorporating Domain Knowledge into Language Transformers for Multi-Label Classification of Chinese Medical Questions'. Together they form a unique fingerprint.
Projects
- 1 Finished
- Chinese Knowledge Base Construction and Applications for Medical Healthcare Domain (2/3)
Lee, L.-H. (PI)
1/05/20 → 30/04/21
Project: Research