Abstract
Human spatial search combines visual search and motion control. Although each has been investigated separately for decades, the coordination between visual search and motion control has not been studied. Analyzing this coordination of sensorimotor behavior through teleoperation could improve the understanding of human search strategies as well as autonomous search algorithms. This research proposes a novel approach to analyzing the coordination between visual attention, observed via gaze patterns, and motion control. The approach is based on estimating human operators' 3D gaze using a Gaussian mixture model (GMM), a hidden Markov model (HMM), and sparse inverse covariance estimation (SICE). Analysis of the human experimental data demonstrates that fixation is used primarily to look at the target, while smooth pursuit is coupled with robot rotation and saccades are coupled with forward motion, both being used to search for new areas to explore. These insights are used to build a functional model of human teleoperation search.
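As a rough illustration only, and not the authors' implementation, the sketch below shows how two of the named tools might be combined in Python with scikit-learn: a three-component GMM separates gaze samples into fixation-, pursuit-, and saccade-like events from velocity features, and graphical-lasso SICE estimates direct couplings between gaze and robot-motion signals. The HMM stage for temporal labeling is omitted for brevity, and all data, feature choices, and variable names here are assumptions.

```python
# Minimal sketch of a GMM + SICE gaze-motion coupling analysis.
# Synthetic data stands in for real gaze and teleoperation recordings.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(0)

# Hypothetical per-sample features: [gaze speed (deg/s), dispersion (deg)].
# Fixations are slow and compact, smooth pursuit moderate, saccades fast.
gaze_features = np.vstack([
    rng.normal([1.0, 0.3], 0.2, size=(300, 2)),     # fixation-like
    rng.normal([15.0, 1.0], 2.0, size=(300, 2)),    # pursuit-like
    rng.normal([150.0, 5.0], 20.0, size=(300, 2)),  # saccade-like
])

# A three-component GMM assigns each gaze sample an event-type label.
gmm = GaussianMixture(n_components=3, random_state=0).fit(gaze_features)
event_labels = gmm.predict(gaze_features)
print("Samples per gaze event cluster:", np.bincount(event_labels))

# Joint signal matrix: gaze features alongside placeholder robot linear
# and angular velocity commands [v, omega]. The sparse precision matrix
# from the graphical lasso exposes which pairs are directly coupled.
robot_motion = rng.normal(size=(gaze_features.shape[0], 2))
signals = np.hstack([gaze_features, robot_motion])

sice = GraphicalLassoCV().fit(signals)
print("Sparse precision matrix:\n", np.round(sice.precision_, 2))
```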
| Original language | English |
| --- | --- |
| Pages (from-to) | 264-271 |
| Number of pages | 8 |
| Journal | IFAC-PapersOnLine |
| Volume | 51 |
| Issue number | 34 |
| DOIs | |
| State | Published - 1 Jan 2019 |
Keywords
- Autonomous mobile robots
- Human-machine interface
- Information analysis
- Machine learning
- Telerobotics