Specific Expert Learning: Enriching Ensemble Diversity via Knowledge Distillation

Wei Cheng Kao, Hong Xia Xie, Chih Yang Lin, Wen Huang Cheng

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)

Abstract

In recent years, ensemble methods have shown excellent performance and gained popularity in visual tasks. However, the performance of an ensemble is limited by the lack of diversity among its models. Thus, to enrich ensemble diversity, we present a distillation approach, learning from experts (LFEs). This approach involves a novel knowledge distillation (KD) method, specific expert learning (SEL), which reduces class selectivity and improves both the performance on specific weaker classes and the overall accuracy. Through SEL, models can acquire different knowledge from distinct networks with various areas of expertise, and a highly diverse ensemble can then be obtained. Our experimental results demonstrate that, on CIFAR-10, SEL increases the accuracy of a single ResNet-32 by 0.91% and the accuracy of the ensemble by 1.13%. In comparison, state-of-the-art approaches such as DML improve accuracy by only 0.3% and 1.02% on the single ResNet-32 and the ensemble, respectively. Furthermore, our proposed architecture can also be applied to ensemble distillation (ED), which applies KD to the ensemble model. In conclusion, our experimental results show that the proposed SEL not only improves the accuracy of a single classifier but also boosts the diversity of the ensemble model.
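The abstract does not give implementation details, but the general mechanism it builds on, distilling knowledge into a student from several teacher networks with different areas of expertise, can be sketched as follows. This is a minimal, generic multi-teacher distillation sketch in PyTorch, not the paper's SEL algorithm; the function name multi_teacher_kd_loss, the temperature T, and the mixing weight alpha are illustrative assumptions rather than values from the paper.

    # Minimal sketch of multi-teacher knowledge distillation (PyTorch).
    # NOT the paper's SEL method; it only illustrates a student learning
    # from several "expert" teachers, each potentially stronger on
    # different classes. T and alpha are assumed hyperparameters.
    import torch
    import torch.nn.functional as F

    def multi_teacher_kd_loss(student_logits, teacher_logits_list, labels,
                              T=4.0, alpha=0.5):
        """Cross-entropy on hard labels plus averaged KL to each teacher."""
        ce = F.cross_entropy(student_logits, labels)
        log_p_student = F.log_softmax(student_logits / T, dim=1)
        kd = 0.0
        for t_logits in teacher_logits_list:
            p_teacher = F.softmax(t_logits / T, dim=1)
            # KL(teacher || student), scaled by T^2 as in standard KD
            kd = kd + F.kl_div(log_p_student, p_teacher,
                               reduction="batchmean") * (T * T)
        kd = kd / len(teacher_logits_list)
        return alpha * ce + (1 - alpha) * kd

    # Toy usage: batch of 8 samples, 10 classes (e.g., CIFAR-10), two teachers.
    student_logits = torch.randn(8, 10, requires_grad=True)
    teachers = [torch.randn(8, 10), torch.randn(8, 10)]
    labels = torch.randint(0, 10, (8,))
    loss = multi_teacher_kd_loss(student_logits, teachers, labels)
    loss.backward()
    print(loss.item())

Averaging the KL terms over teachers is one simple design choice; the paper's contribution lies in how each teacher's class-specific expertise is selected and weighted, which this sketch does not attempt to reproduce.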

Original language: English
Pages (from-to): 2494-2505
Number of pages: 12
Journal: IEEE Transactions on Cybernetics
Volume: 53
Issue number: 4
DOIs
Publication status: Published - 1 Apr 2023
