Analysis of Coordination Patterns between Gaze and Control in Human Spatial Search

Kuo Shih Tseng, Bérénice Mettler

Research output: Contribution to journal › Article › peer-review


Abstract

Human spatial search combines visual search and motion control problems. Both have been investigated separately over decades; however, the coordination between visual search and motion control has not been investigated. Analyzing the coordination of sensory-motor behavior through teleoperation could improve the understanding of human search strategies as well as of autonomous search algorithms. This research proposes a novel approach to analyze the coordination between visual attention, observed via gaze patterns, and motion control. The approach is based on estimating human operators' 3D gaze using a Gaussian mixture model (GMM), a hidden Markov model (HMM), and sparse inverse covariance estimation (SICE). The analysis of the human experimental data demonstrates that fixation is used primarily to look at the target, smooth pursuit is coupled with robot rotation and used to search new areas, and saccades are coupled with forward motion and likewise used to search new areas. These insights are used to build a functional model of human teleoperation search.
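The abstract names three tools: a GMM and an HMM for classifying gaze behavior, and SICE for relating gaze to control. The sketch below is an illustrative reconstruction of such a pipeline, not the authors' implementation: it runs scikit-learn and hmmlearn on synthetic data, and the gaze features (speed, dispersion), the control channels, and the regularization weight alpha are all assumptions made for the example.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.covariance import GraphicalLasso
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)

# Synthetic stand-ins for recorded signals (T samples). In the paper the
# inputs are estimated 3D gaze and teleoperation commands; here the gaze
# features and control channels are placeholders.
T = 2000
gaze_feats = rng.normal(size=(T, 2))   # e.g. [gaze speed, gaze dispersion]
controls = rng.normal(size=(T, 2))     # e.g. [forward velocity, yaw rate]

# 1) GMM: cluster the gaze features into three putative event types
#    (fixation, smooth pursuit, saccade).
gmm = GaussianMixture(n_components=3, random_state=0).fit(gaze_feats)

# 2) HMM: reuse the GMM means as emission means and let the HMM add
#    temporal continuity, since gaze events persist across many samples.
hmm = GaussianHMM(n_components=3, covariance_type="full", n_iter=50,
                  init_params="stc", random_state=0)
hmm.means_ = gmm.means_
hmm.fit(gaze_feats)
states = hmm.predict(gaze_feats)       # one label in {0, 1, 2} per sample

# 3) SICE via the graphical lasso: estimate a sparse precision matrix over
#    gaze-state indicators and control signals. One indicator column is
#    dropped because the three indicators sum to one (singular covariance).
indicators = np.eye(3)[states][:, :2]
joint = np.hstack([indicators, controls])
joint = (joint - joint.mean(axis=0)) / joint.std(axis=0)
sice = GraphicalLasso(alpha=0.05).fit(joint)

# Nonzero off-diagonal entries of the precision matrix flag conditional
# dependence, i.e. coupling, between gaze states and control channels.
print(np.round(sice.precision_, 2))
```

Under this reading, a coupling such as "smooth pursuit co-occurs with rotation" would appear as a nonzero precision entry between the corresponding gaze-state indicator and the yaw-rate column; with the synthetic independent data above, the off-diagonal entries should be near zero.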

Original language: English
Pages (from-to): 264-271
Number of pages: 8
Journal: IFAC-PapersOnLine
Volume: 51
Issue number: 34
DOIs
State: Published - 1 Jan 2019

Keywords

  • Autonomous mobile robots
  • Human-machine interface
  • Information analysis
  • Machine learning
  • Telerobotics

