Incorporating Domain Knowledge into Language Transformers for Multi-Label Classification of Chinese Medical Questions

Po-Han Chen, Yu-Xiang Zeng, Lung-Hao Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

In this paper, we propose a knowledge infusion mechanism to incorporate domain knowledge into language transformers. Weakly supervised data is regarded as the main source for knowledge acquisition. We pre-train the language models to capture masked knowledge of focuses and aspects, and then fine-tune them to obtain better performance on the downstream tasks. Due to the lack of publicly available datasets for multi-label classification of Chinese medical questions, we crawled questions from medical question/answer forums and manually annotated them with eight predefined classes: persons and organizations, symptom, cause, examination, disease, information, ingredient, and treatment. In total, we collected 1,814 questions with 2,340 labels, an average of 1.29 labels per question. We used Baidu Medical Encyclopedia as the knowledge resource. Two transformers, BERT and RoBERTa, were implemented to compare performance on our constructed dataset. Experimental results showed that our proposed model with the knowledge infusion mechanism achieves better performance regardless of which evaluation metric is considered: Macro F1, Micro F1, Weighted F1, or Subset Accuracy.
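For concreteness, the following is a minimal sketch of how such a multi-label setup and the reported metrics can be computed with the Hugging Face transformers and scikit-learn libraries. It is not the authors' implementation: the knowledge-infusion pre-training step is not reproduced, and the base checkpoint (bert-base-chinese), the 0.5 decision threshold, and the helper names are illustrative assumptions.

    # Minimal sketch (not the authors' code): multi-label classification of
    # Chinese medical questions over the eight classes named in the abstract.
    # Assumes Hugging Face transformers, PyTorch, and scikit-learn.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification
    from sklearn.metrics import f1_score, accuracy_score

    LABELS = ["persons and organizations", "symptom", "cause", "examination",
              "disease", "information", "ingredient", "treatment"]

    # Base checkpoint is an assumption; the classification head is freshly
    # initialized and would need fine-tuning on the annotated questions.
    tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
    model = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-chinese",
        num_labels=len(LABELS),
        problem_type="multi_label_classification",  # BCE-with-logits loss
    )

    def predict(questions, threshold=0.5):
        """Return a binary label matrix of shape (num_questions, 8)."""
        enc = tokenizer(questions, padding=True, truncation=True,
                        return_tensors="pt")
        with torch.no_grad():
            logits = model(**enc).logits
        # Independent sigmoid per class allows multiple labels per question.
        return (torch.sigmoid(logits) > threshold).int().numpy()

    def evaluate(y_true, y_pred):
        """Metrics reported in the paper, computed on binary label matrices."""
        return {
            "Macro F1": f1_score(y_true, y_pred, average="macro", zero_division=0),
            "Micro F1": f1_score(y_true, y_pred, average="micro", zero_division=0),
            "Weighted F1": f1_score(y_true, y_pred, average="weighted", zero_division=0),
            "Subset Accuracy": accuracy_score(y_true, y_pred),  # exact label-set match
        }

Subset Accuracy is the strictest of the four metrics, since a prediction counts as correct only if the entire predicted label set matches the gold set for that question.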

Original language: English
Title of host publication: ROCLING 2021 - Proceedings of the 33rd Conference on Computational Linguistics and Speech Processing
Editors: Lung-Hao Lee, Chia-Hui Chang, Kuan-Yu Chen
Publisher: The Association for Computational Linguistics and Chinese Language Processing (ACLCLP)
Pages: 265-270
Number of pages: 6
ISBN (Electronic): 9789869576949
State: Published - 2021
Event: 33rd Conference on Computational Linguistics and Speech Processing, ROCLING 2021 - Taoyuan, Taiwan
Duration: 15 Oct 2021 – 16 Oct 2021

Publication series

Name: ROCLING 2021 - Proceedings of the 33rd Conference on Computational Linguistics and Speech Processing

Conference

Conference: 33rd Conference on Computational Linguistics and Speech Processing, ROCLING 2021
Country/Territory: Taiwan
City: Taoyuan
Period: 15/10/21 – 16/10/21

Keywords

  • Biomedical informatics
  • Domain knowledge extraction
  • Pretrained language models
  • Text classification
