智慧型照護互動系統-視障人士的智慧夥伴(4/4) (Intelligent Care Interaction System: A Smart Companion for the Visually Impaired, 4/4)

Project Details

Description

The goal of this project is to deliver an integrated system that facilitates visually impaired people in their daily activities. The system will support both safe outdoor travel and convenient indoor activities. Using deep learning technologies, our team will design and implement navigation devices, whether wearable or carried by a robot, that help visually impaired persons travel and recognize or find objects.

There are about 180,000 visually impaired people in Taiwan, and their living areas and social interactions are quite limited. Independent navigation and travel are key to their dignity and to their participation in local communities. In the past, visually impaired people used a white cane as an extension of the hand; outdoors, guide dogs are commonly used. However, guide dogs are very expensive, and they may not fully understand the needs of their users. Relying on assistants or relatives is also costly. It is therefore important to design a system that can communicate verbally with visually impaired users and guide them through outdoor and indoor environments while avoiding obstacles.

The project will develop a complete intelligent system, covering hardware and firmware as well as the software platform, organized into four sub-tasks.

Task one is responsible for the design and development of an autonomous robot and a wearable device, including GPS guidance and walking-safety functions.

Task two uses deep learning networks to help the user recognize street signs, street scenes, and street events, so that the user knows his or her location and the surrounding situation; understanding street events further helps the user react faster and more easily.

Task three develops a machine hearing system so that the user can communicate with the robot guide. The system screens out noise and lets the user control the robot remotely with verbal commands, and it can further support open dialogue between the user and the robot. We also investigate low-power designs and aim to deliver an optimized DRA chip to enable the development of wearable devices.

Task four develops an intelligent indoor supporting system. Indoor environments include living spaces, office spaces, and unfamiliar spaces (e.g., hospitals, government offices, shops). The system provides two types of information: in the living space, the user is told whether furniture or objects have been moved; in other spaces, the user is told the locations of stairs, escalators, or elevators, and certain warning signs can be recognized.

After the four tasks are completed and the related systems are developed, the project will implement the algorithms on chips to reduce the size of the hardware components, so that they are small enough to be placed in a wearable device.
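Task one's guidance loop can be pictured, in very rough terms, as repeatedly comparing the user's GPS position and heading against the next waypoint while a range sensor watches for obstacles. The Python sketch below is purely illustrative and assumes hypothetical names (Waypoint, guidance_step, read-outs in meters, a 1.5 m safety threshold); it is not the project's actual firmware.

# Hypothetical sketch of a Task 1 guidance tick: steer toward the next GPS
# waypoint unless an obstacle is too close. All names are illustrative.
import math
from dataclasses import dataclass

@dataclass
class Waypoint:
    lat: float
    lon: float

def bearing_deg(cur: Waypoint, target: Waypoint) -> float:
    """Initial great-circle bearing from cur to target, in degrees."""
    lat1, lat2 = map(math.radians, (cur.lat, target.lat))
    dlon = math.radians(target.lon - cur.lon)
    y = math.sin(dlon) * math.cos(lat2)
    x = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def guidance_step(cur: Waypoint, heading_deg: float, target: Waypoint,
                  obstacle_distance_m: float) -> str:
    """Return a spoken-style instruction for one control-loop tick."""
    if obstacle_distance_m < 1.5:          # safety check comes first
        return "Stop. Obstacle ahead."
    turn = (bearing_deg(cur, target) - heading_deg + 540.0) % 360.0 - 180.0
    if abs(turn) < 15.0:
        return "Continue straight."
    return f"Turn {'right' if turn > 0 else 'left'} about {abs(round(turn))} degrees."

# Example tick: user faces north, waypoint lies to the north-east, path is clear.
print(guidance_step(Waypoint(25.0330, 121.5654), 0.0,
                    Waypoint(25.0340, 121.5670), obstacle_distance_m=5.0))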
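Task two's street recognition can be sketched as a single inference call on a camera frame. The snippet below stands in for the project's own street-sign and street-event networks by reusing an off-the-shelf ImageNet classifier from torchvision; the function name describe_scene and the input file street.jpg are assumptions made for illustration, not project APIs.

# Minimal stand-in for Task 2 inference: classify a street photo with a
# pretrained CNN and return a label the system could read out loud.
import torch
from PIL import Image
from torchvision.models import resnet18, ResNet18_Weights

weights = ResNet18_Weights.DEFAULT
model = resnet18(weights=weights).eval()
preprocess = weights.transforms()           # resize, crop, normalize

def describe_scene(image_path: str) -> str:
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # shape (1, 3, H, W)
    with torch.no_grad():
        probs = model(batch).softmax(dim=1)[0]
    score, idx = probs.max(dim=0)
    label = weights.meta["categories"][int(idx)]
    return f"I see {label} (confidence {score.item():.0%})."

# Example (assuming a local photo named street.jpg exists):
# print(describe_scene("street.jpg"))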
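Task three combines noise screening with verbal command control. Assuming an upstream speech recognizer (not shown) that produces a transcript plus a confidence score, the command layer might look like the sketch below; the vocabulary, threshold, and function names are illustrative only.

# Sketch of a Task 3 command layer: ignore low-confidence audio as noise,
# match known phrases, and fall back to open dialogue otherwise.
from typing import Optional

COMMANDS = {
    "go forward": "MOVE_FORWARD",
    "stop": "STOP",
    "turn left": "TURN_LEFT",
    "turn right": "TURN_RIGHT",
    "where am i": "ANNOUNCE_LOCATION",
}

def interpret(transcript: str, confidence: float,
              min_confidence: float = 0.6) -> Optional[str]:
    """Map a recognized utterance to a robot command, or None if it should
    be ignored (low confidence, i.e. probably noise, or unknown phrase)."""
    if confidence < min_confidence:
        return None                        # treat low-confidence audio as noise
    text = transcript.lower().strip()
    for phrase, command in COMMANDS.items():
        if phrase in text:
            return command
    return None                            # hand off to open dialogue handling

print(interpret("please stop now", 0.9))   # -> STOP
print(interpret("indistinct mumbling", 0.3))  # -> None (screened out)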
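Task four's living-space check reduces to comparing the current positions of detected objects against a stored room layout and announcing anything that has moved. The sketch below assumes a hypothetical perception module that reports object positions in meters; the 0.3 m tolerance is an arbitrary example value.

# Sketch of the Task 4 living-space check: report objects whose position
# differs from the stored baseline layout by more than a tolerance.
import math

def moved_objects(baseline: dict[str, tuple[float, float]],
                  current: dict[str, tuple[float, float]],
                  tolerance_m: float = 0.3) -> list[str]:
    """Names of objects whose position differs from the stored layout."""
    moved = []
    for name, (bx, by) in baseline.items():
        if name not in current:
            continue                       # missing objects handled elsewhere
        cx, cy = current[name]
        if math.hypot(cx - bx, cy - by) > tolerance_m:
            moved.append(name)
    return moved

baseline = {"chair": (1.0, 2.0), "table": (3.0, 1.0)}
current = {"chair": (1.1, 2.0), "table": (3.0, 2.2)}   # table pushed ~1.2 m
for name in moved_objects(baseline, current):
    print(f"The {name} has been moved.")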
Status: Finished
Effective start/end date: 1/01/21 to 31/12/21

UN Sustainable Development Goals

In 2015, UN member states agreed to 17 global Sustainable Development Goals (SDGs) to end poverty, protect the planet and ensure prosperity for all. This project contributes towards the following SDG(s):

  • SDG 9 - Industry, Innovation, and Infrastructure
  • SDG 12 - Responsible Consumption and Production
  • SDG 16 - Peace, Justice and Strong Institutions
  • SDG 17 - Partnerships for the Goals

Keywords

  • Deep learning
  • Robots
  • Wearable device
  • Street view reasoning
  • Artificial intelligence
  • Visually impaired supporting system
  • Machine hearing
