Anti-Aliasing Attention U-net Model for Skin Lesion Segmentation

Phuong Thi Le, Bach Tung Pham, Ching Chun Chang, Yi Chiung Hsu, Tzu Chiang Tai, Yung Hui Li, Jia Ching Wang

Research output: Contribution to journal › Article › peer-review

8 Citations (Scopus)

Abstract

The need for a lightweight and reliable segmentation algorithm is critical in various biomedical image-prediction applications. However, the limited quantity of available data presents a significant challenge for image segmentation. Additionally, low image quality reduces segmentation performance, and previous deep learning models for image segmentation require a large number of parameters and hundreds of millions of computations, resulting in high costs and long processing times. In this study, we introduce a new lightweight segmentation model, the mobile anti-aliasing attention U-net model (MAAU), which comprises an encoder and a decoder path. The encoder incorporates an anti-aliasing layer and convolutional blocks to reduce the spatial resolution of input images while mitigating the loss of shift equivariance caused by downsampling. The decoder uses an attention block and a decoder module to capture the prominent features in each channel. To address the data-related problems, we applied data augmentation methods such as flips, rotations, shears, translations, and color distortions, which improved segmentation performance on the International Skin Imaging Collaboration (ISIC) 2018 and PH2 datasets. Our experimental results demonstrate that our approach uses fewer parameters, only 4.2 million, while outperforming various state-of-the-art segmentation methods.
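The abstract does not spell out how the encoder's anti-aliasing layer is built. As a rough illustration of the general technique it names (low-pass filtering before strided subsampling, in the style of blur-pooling), a minimal PyTorch sketch is given below. The module name `AntiAliasDownsample`, the 3x3 binomial kernel, and the stride are assumptions for illustration only, not the authors' implementation.

```python
# Illustrative sketch only: a blur-pool style anti-aliased downsampling layer,
# assumed here as one plausible form of the "anti-aliasing layer" in the encoder.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AntiAliasDownsample(nn.Module):
    """Applies a fixed low-pass (binomial) blur before strided subsampling."""

    def __init__(self, channels: int, stride: int = 2):
        super().__init__()
        self.stride = stride
        self.channels = channels
        # 3x3 binomial kernel ([1, 2, 1] outer [1, 2, 1], normalized) as the low-pass filter.
        k = torch.tensor([1.0, 2.0, 1.0])
        kernel = torch.outer(k, k)
        kernel = kernel / kernel.sum()
        # One identical filter per channel (depthwise convolution).
        self.register_buffer("kernel", kernel.expand(channels, 1, 3, 3).clone())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Blur first, then subsample, so high frequencies do not alias.
        x = F.pad(x, (1, 1, 1, 1), mode="reflect")
        return F.conv2d(x, self.kernel, stride=self.stride, groups=self.channels)


if __name__ == "__main__":
    layer = AntiAliasDownsample(channels=16)
    out = layer(torch.randn(1, 16, 64, 64))
    print(out.shape)  # torch.Size([1, 16, 32, 32])
```

In a U-net-style encoder, a block like this would typically replace plain strided convolution or max pooling at each resolution step, keeping feature maps smoother under small input shifts.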

Original language: English
Article number: 1460
Journal: Diagnostics
Volume: 13
Issue number: 8
DOIs
Publication status: Published - Apr 2023
