Object Bounding Transformed Network for End-to-End Semantic Segmentation

Kuan Chung Wang, Chien Yao Wang, Tzu Chiang Tai, Jia Ching Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Peer-reviewed

Abstract

In recent years, numerous studies on the use of Fully Convolutional Networks (FCNs) for image semantic segmentation have been published. This work introduces an end-to-end Object Bounding Transformed Network (OBTNet), which combines the advantages of Object Boundary Guided (OBG) segmentation and the Domain Transform (DT). OBG is an object-boundary-based approach that improves the integrity of object shapes. Building on OBG, we propose an Object Boundary Network (OBN) as the generator of object regions and object boundaries. In addition, our system preserves object regions and object boundaries by employing the DT. The proposed system uses a pretrained multi-scale ResNet101 as the base network and applies multi-scale atrous convolution to preserve the dimensions of the feature map, increasing the accuracy of semantic segmentation. Experiments show that our system yielded a mean IoU of 77.74% and outperformed the baseline model on the VOC2012 test set.
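The abstract does not spell out how the Domain Transform is applied in OBTNet. For orientation only, below is a minimal NumPy sketch of the standard domain transform recursive filter (Gastal & Oliveira style), as commonly used for boundary-aware refinement of segmentation scores along one scanline; the function name domain_transform_1d and the parameters sigma_s, sigma_r, and num_iters are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def domain_transform_1d(x, edge, sigma_s=100.0, sigma_r=0.5, num_iters=3):
    """Illustrative 1-D edge-preserving recursive filter (domain transform).

    x    : (N,) per-pixel scores along one scanline.
    edge : (N,) boundary strengths in [0, 1]; large values suppress
           smoothing across object boundaries.
    Note: a sketch of the generic technique, not the paper's exact method.
    """
    y = x.astype(np.float64).copy()
    for it in range(num_iters):
        # Spatial std shrinks each iteration so repeated 1-D passes
        # approximate the intended 2-D smoothing (standard DT schedule).
        sigma_h = (sigma_s * np.sqrt(3.0) * 2.0 ** (num_iters - it - 1)
                   / np.sqrt(4.0 ** num_iters - 1.0))
        a = np.exp(-np.sqrt(2.0) / sigma_h)
        # Per-pixel feedback weight: near 1 in flat regions (strong
        # smoothing), near 0 at boundaries (no propagation across them).
        d = 1.0 + (sigma_s / sigma_r) * edge
        w = a ** d
        # Left-to-right pass.
        for i in range(1, len(y)):
            y[i] = (1.0 - w[i]) * y[i] + w[i] * y[i - 1]
        # Right-to-left pass.
        for i in range(len(y) - 2, -1, -1):
            y[i] = (1.0 - w[i + 1]) * y[i] + w[i + 1] * y[i + 1]
    return y
```

In boundary-guided refinement of this kind, the edge signal is typically produced by a learned boundary branch (here, the OBN would play that role), so smoothing propagates freely inside object regions but stops at predicted object boundaries.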

Original language: English
Title of host publication: 2019 IEEE International Conference on Image Processing, ICIP 2019 - Proceedings
Publisher: IEEE Computer Society
Pages: 3217-3221
Number of pages: 5
ISBN (Electronic): 9781538662496
DOIs
Publication status: Published - Sep 2019
Event: 26th IEEE International Conference on Image Processing, ICIP 2019 - Taipei, Taiwan
Duration: 22 Sep 2019 - 25 Sep 2019

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
Volume: 2019-September
ISSN (Print): 1522-4880

Conference

Conference: 26th IEEE International Conference on Image Processing, ICIP 2019
Country/Territory: Taiwan
City: Taipei
Period: 22/09/19 - 25/09/19
