Outdoor walking guide for the visually-impaired people based on semantic segmentation and depth map

I. Hsuan Hsieh, Hsiao Chu Cheng, Hao Hsiang Ke, Hsiang Chieh Chen, Wen June Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

6 Citations (Scopus)

Abstract

In this study, we propose a wearable guiding system, containing an embedded system (Nvidia Jetson AGX Xavier) and an RGB-D binocular depth camera (Stereolabs ZED2), for guiding visually impaired people walking outdoors. Using a deep-learning image segmentation model and the depth map obtained by the ZED2, the image in front of the user is divided into seven divisions. Each division has a walkability confidence computed by our specific methods. Based on these confidences, the most suitable direction for the visually impaired person is selected, and voice prompts are played to lead the person forward on the sidewalk or across a crosswalk safely. An experiment is performed to verify the effectiveness of the proposed system.
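The direction-selection step described above can be sketched as follows. This is a minimal illustration, not the authors' actual method: the walkable class IDs, the per-division confidence (here simply the fraction of walkable pixels, ignoring the depth map), and the function names are all assumptions for demonstration.

```python
import numpy as np

# Assumed walkable class labels; the paper's actual class set is not
# given in the abstract (e.g. 1 = sidewalk, 2 = crosswalk).
WALKABLE_CLASSES = [1, 2]

def division_confidences(seg_map: np.ndarray, n_divisions: int = 7) -> list:
    """Split the segmentation map into vertical divisions and score each
    by its fraction of walkable pixels (a stand-in for the paper's
    confidence computation, which also uses the depth map)."""
    h, w = seg_map.shape
    bounds = np.linspace(0, w, n_divisions + 1, dtype=int)
    walkable = np.isin(seg_map, WALKABLE_CLASSES)
    return [float(walkable[:, bounds[i]:bounds[i + 1]].mean())
            for i in range(n_divisions)]

def pick_direction(confidences: list) -> str:
    """Map the highest-confidence division to a spoken prompt."""
    best = int(np.argmax(confidences))
    mid = len(confidences) // 2
    if best < mid:
        return "veer left"
    if best > mid:
        return "veer right"
    return "walk straight"

# Toy example: walkable region concentrated on the right of the frame.
seg = np.zeros((4, 14), dtype=int)
seg[:, 10:] = 1
conf = division_confidences(seg)
print(pick_direction(conf))  # prints "veer right"
```

In the real system the selected direction would be passed to a text-to-speech engine to produce the voice prompt.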

Original language: English
Title of host publication: Proceedings - 2020 International Conference on Pervasive Artificial Intelligence, ICPAI 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 144-147
Number of pages: 4
ISBN (Electronic): 9781665404839
DOIs
Publication status: Published - Dec 2020
Event: 1st International Conference on Pervasive Artificial Intelligence, ICPAI 2020 - Taipei, Taiwan
Duration: 3 Dec 2020 - 5 Dec 2020

Publication series

Name: Proceedings - 2020 International Conference on Pervasive Artificial Intelligence, ICPAI 2020

Conference

Conference: 1st International Conference on Pervasive Artificial Intelligence, ICPAI 2020
Country/Territory: Taiwan
City: Taipei
Period: 3/12/20 - 5/12/20
