RNN-based Dialogue Navigation System for Visually Impaired

Ching Han Chen, Ming Fang Shiu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Helping visually impaired people walk and guiding them to a destination is a challenging task. The difficulty lies in using natural language as the only means of communication while helping the visually impaired user perceive the road ahead. We have developed a conversational navigation system that uses multiple rounds of dialogue to confirm the user's destination. In addition, we use RNN-based image-captioning technology to describe the scene in front of the user. Finally, in terms of hardware, we implement the application on a low-cost, low-power embedded platform that integrates a camera, Wi-Fi, and a microphone.
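To illustrate the image-captioning component the abstract describes, the sketch below shows a minimal RNN caption decoder: image features initialize the hidden state, and tokens are decoded greedily one step at a time. All dimensions, the toy vocabulary, and the random weights are hypothetical for illustration only; the paper's actual architecture and training details are not specified here.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["<start>", "<end>", "a", "road", "ahead", "person"]   # toy vocabulary
V, E, H, F = len(VOCAB), 8, 16, 32   # vocab, embedding, hidden, image-feature dims

# Randomly initialised parameters (a real system would learn these from data).
W_embed = rng.normal(0, 0.1, (V, E))
W_init  = rng.normal(0, 0.1, (F, H))   # projects image features to the initial state
W_xh    = rng.normal(0, 0.1, (E, H))
W_hh    = rng.normal(0, 0.1, (H, H))
W_hy    = rng.normal(0, 0.1, (H, V))

def caption(image_feat, max_len=5):
    """Greedy decoding: condition the RNN on image features, emit words."""
    h = np.tanh(image_feat @ W_init)          # initial hidden state from the image
    tok = VOCAB.index("<start>")
    out = []
    for _ in range(max_len):
        x = W_embed[tok]                      # embed the previous token
        h = np.tanh(x @ W_xh + h @ W_hh)      # vanilla RNN step
        logits = h @ W_hy
        tok = int(np.argmax(logits))          # greedy choice of next token
        if VOCAB[tok] == "<end>":
            break
        out.append(VOCAB[tok])
    return out

words = caption(rng.normal(size=F))
print(words)
```

With untrained weights the output words are arbitrary; the sketch only shows the decoding loop structure that an RNN captioner of this kind uses.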

Original language: English
Title of host publication: Proceedings - 2020 International Conference on Pervasive Artificial Intelligence, ICPAI 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 140-143
Number of pages: 4
ISBN (Electronic): 9781665404839
DOIs
State: Published - Dec 2020
Event: 1st International Conference on Pervasive Artificial Intelligence, ICPAI 2020 - Taipei, Taiwan
Duration: 3 Dec 2020 - 5 Dec 2020

Publication series

Name: Proceedings - 2020 International Conference on Pervasive Artificial Intelligence, ICPAI 2020

Conference

Conference: 1st International Conference on Pervasive Artificial Intelligence, ICPAI 2020
Country/Territory: Taiwan
City: Taipei
Period: 3/12/20 - 5/12/20

Keywords

  • dialogue system
  • embedded hardware
  • image captioning
  • navigation
  • visually impaired
