In recent years, Lidar sensors have been widely used in the SLAM systems of cleaning robots, but Lidar typically provides only a planar scan of the environment and lacks height information. When a cleaning robot enters a confined area, it may be unable to leave due to height restrictions. Although 3D vision can detect such spatial obstacles, its high cost and computational complexity make it difficult to apply in consumer cleaning robots, and its accuracy and reliability still need improvement. This project will develop a dual-line laser 3D ranging method for cleaning robots, using two line lasers and a global-shutter image sensor to build a 3D model of the environment. We will implement a robot platform, obtain the robot's motion path from wheel encoders and IMU sensing, and fuse it with the 3D coordinates from dual-line laser imaging. We will use a fusion of the A* and DWA algorithms for robot path planning, and complete a 2D Lidar ranging system and a 3D SLAM system. Finally, we will use the cleaning-robot experimental platform to verify that this system achieves higher efficiency, accuracy, and obstacle avoidance than cleaning robots on the market, and can be applied to a new generation of cleaning robots and a wide range of autonomous robot products.
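The abstract does not spell out how the line-laser images yield 3D coordinates. A common approach is laser triangulation: with the laser mounted at a known baseline from the camera, the lateral pixel offset of the imaged stripe encodes depth. The sketch below illustrates that geometry only; the function name, the vertical-laser-plane assumption, and all parameters are illustrative, not details from the project.

```python
def laser_point_3d(u, v, u0, v0, f, baseline):
    """Recover a 3D point on a laser stripe by triangulation (pinhole model).

    Assumes a vertical laser plane parallel to the optical axis, offset
    horizontally from the camera by `baseline` -- an illustrative setup,
    not the project's actual calibration.

    u, v     : pixel coordinates of a point on the imaged stripe
    u0, v0   : principal point of the camera (pixels)
    f        : focal length (pixels)
    baseline : laser-to-camera offset (metres)
    """
    disparity = u - u0  # lateral shift of the stripe in the image (pixels)
    if disparity == 0:
        raise ValueError("stripe at principal point: depth unresolvable")
    z = f * baseline / disparity   # depth along the optical axis
    x = (u - u0) * z / f           # lateral position (lies in the laser plane)
    y = (v - v0) * z / f           # height relative to the optical axis
    return x, y, z
```

With two line lasers at different mounting heights or angles, the same computation runs per stripe, giving height information that a planar Lidar scan cannot provide.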
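The abstract names an A*/DWA fusion for path planning: typically A* supplies a global path over the occupancy grid and DWA tracks it with dynamically feasible local velocity commands. The global layer can be sketched as a standard grid A*; the 4-connectivity, Manhattan heuristic, and string-grid map format here are assumptions for illustration, not details of the project's planner.

```python
import heapq

def astar(grid, start, goal):
    """Shortest path on a 4-connected grid via A* with a Manhattan heuristic.

    grid  : list of equal-length strings; '#' marks an obstacle cell
    start, goal : (row, col) tuples
    Returns the path as a list of (row, col), or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])

    def h(p):  # admissible heuristic: Manhattan distance to goal
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    gbest = {start: 0}       # best known cost-to-come per cell
    came = {start: None}     # parent pointers for path reconstruction
    heap = [(h(start), start)]
    while heap:
        f, cur = heapq.heappop(heap)
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came[cur]
            return path[::-1]
        if f > gbest[cur] + h(cur):
            continue  # stale heap entry superseded by a cheaper one
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = (cur[0] + dr, cur[1] + dc)
            if (0 <= nb[0] < rows and 0 <= nb[1] < cols
                    and grid[nb[0]][nb[1]] != '#'):
                ng = gbest[cur] + 1
                if ng < gbest.get(nb, float('inf')):
                    gbest[nb] = ng
                    came[nb] = cur
                    heapq.heappush(heap, (ng + h(nb), nb))
    return None
```

In a fused planner, each waypoint of the returned path would be handed to the DWA local planner, which samples velocity commands within the robot's dynamic limits and scores them against the waypoint and nearby obstacles.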
|Effective start/end date||1/11/20 → 31/10/21|
- Line Laser
- Vision Sensor
- 3D Inspection
- Cleaning Robot