Reinforced Cascading Convolutional Neural Networks and Vision Transformer for Lung Disease Diagnosis

Fityanul Akhyar, Ledya Novamizanti, Raihan Arfi Maulana, Chi Wen Lung, Chih Yang Lin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Lung diseases are among the deadliest infectious diseases worldwide. COVID-19 is a recent disease in this category that has impacted public health in countries across the globe. Accordingly, this study builds a lung disease identification system using a state-of-the-art deep cascade learning classification model, EfficientNet-Vision Transformer. In the proposed system, Real-ESRGAN is utilized to enhance the input of EfficientNet, while image Relative Position Encoding (iRPE) is added to improve the attention of the transformer network. Moreover, weight balancing is applied to stabilize the performance of the proposed system. When trained on a chest X-ray dataset, our model achieved 93.757% accuracy across five classes: Normal, COVID-19, Viral Pneumonia, Bacterial Pneumonia, and Tuberculosis.
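To make the cascade concrete, the sketch below shows one plausible way to wire an EfficientNet feature extractor into a transformer encoder with a class-weighted loss, assuming PyTorch and torchvision. It is a minimal illustration, not the authors' released code: the EfficientNet variant (B0), embedding size, layer counts, and class counts are all assumptions, Real-ESRGAN preprocessing is omitted, and a plain transformer encoder stands in for the ViT with iRPE attention.

```python
# Hypothetical sketch of the EfficientNet -> Vision Transformer cascade
# described in the abstract. All sizes and variant choices are assumptions.
import torch
import torch.nn as nn
from torchvision import models

class EfficientNetViTCascade(nn.Module):
    """EfficientNet backbone feeding a transformer encoder classifier."""
    def __init__(self, num_classes: int = 5):
        super().__init__()
        # EfficientNet-B0 as the convolutional feature extractor (the paper
        # does not specify the variant; pretrained weights would normally
        # be loaded here).
        effnet = models.efficientnet_b0(weights=None)
        self.cnn = effnet.features  # output: (B, 1280, H/32, W/32)
        # Project CNN feature maps to token embeddings for the transformer.
        self.proj = nn.Conv2d(1280, 256, kernel_size=1)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=256, nhead=8, batch_first=True)
        # Plain encoder stands in for the paper's ViT with iRPE; image
        # Relative Position Encoding would modify the attention here.
        self.transformer = nn.TransformerEncoder(encoder_layer, num_layers=4)
        self.head = nn.Linear(256, num_classes)

    def forward(self, x):
        f = self.proj(self.cnn(x))             # (B, 256, h, w)
        tokens = f.flatten(2).transpose(1, 2)  # (B, h*w, 256)
        z = self.transformer(tokens).mean(dim=1)  # average-pool tokens
        return self.head(z)

# Weight balancing: per-class weights inversely proportional to class
# frequency, passed to the loss to counter class imbalance. The exact
# scheme in the paper is not specified; these counts are hypothetical.
counts = torch.tensor([1000., 400., 300., 300., 200.])
weights = counts.sum() / (len(counts) * counts)
criterion = nn.CrossEntropyLoss(weight=weights)

model = EfficientNetViTCascade(num_classes=5)
logits = model(torch.randn(2, 3, 224, 224))   # two dummy chest X-rays
loss = criterion(logits, torch.tensor([0, 3]))
```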

Original language: English
Title of host publication: Proceedings - 2022 IEEE International Conference on Consumer Electronics - Taiwan, ICCE-Taiwan 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 201-202
Number of pages: 2
ISBN (Electronic): 9781665470506
DOIs
State: Published - 2022
Event: 2022 IEEE International Conference on Consumer Electronics - Taiwan, ICCE-Taiwan 2022 - Taipei, Taiwan
Duration: 6 Jul 2022 – 8 Jul 2022

Publication series

Name: Proceedings - 2022 IEEE International Conference on Consumer Electronics - Taiwan, ICCE-Taiwan 2022

Conference

Conference: 2022 IEEE International Conference on Consumer Electronics - Taiwan, ICCE-Taiwan 2022
Country/Territory: Taiwan
City: Taipei
Period: 6/07/22 – 8/07/22
