Robust tracking using visual cue integration for mobile mixed images

Hsiao Tzu Chen, Chih Wei Tang

Research output: Contribution to journal › Article › peer-review

2 Scopus citations


The transmitted scene superposed with the scene reflected from a transparent surface produces mixed images. Although such images are ubiquitous in the real world, few methods have been devoted to tracking in them. This paper therefore proposes a robust single-object tracking scheme for mixed images acquired by mobile cameras. Before tracking, layer separation decomposes the mixed images to extract the intrinsic dynamic layers. To make the tracker robust against camera motion, motion compensation is applied both to layer separation and to the prediction stage of the particle filter. To maximize the observation likelihood, and thus optimize particle weights in the presence of reflections, the proposed scheme combines sequential importance resampling (SIR) based co-inference with maximum likelihood for multi-cue integration. Experimental results show that the proposed scheme effectively improves tracking accuracy on mixed images with camera motion.
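The abstract's pipeline (motion-compensated prediction, multi-cue likelihood, SIR resampling) can be illustrated with a minimal sketch. This is not the authors' implementation: the 1-D state, the translational `cam_motion` offset, and the Gaussian cue functions are simplifying assumptions made purely for illustration; the paper operates on separated image layers with real visual cues.

```python
import numpy as np

def sir_particle_filter_step(particles, weights, cam_motion, observe_cues,
                             noise_std=2.0, rng=None):
    """One SIR step: motion-compensated prediction, multi-cue weighting, resampling.

    particles: (N,) array of 1-D object positions (illustrative state).
    cam_motion: estimated camera-motion offset added during prediction,
                standing in for the paper's motion compensation
                (assumption: simple translational model).
    observe_cues: list of functions, each mapping positions to a per-cue
                  likelihood; cue likelihoods are multiplied, one common
                  way to integrate multiple visual cues.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    # Prediction: apply camera-motion compensation plus process noise.
    predicted = particles + cam_motion + rng.normal(0.0, noise_std, size=particles.shape)
    # Observation: combine cue likelihoods into one weight update.
    likelihood = np.ones_like(predicted)
    for cue in observe_cues:
        likelihood *= cue(predicted)
    weights = weights * likelihood
    weights /= weights.sum()
    # Sequential importance resampling: draw particles proportional to weight.
    idx = rng.choice(len(predicted), size=len(predicted), p=weights)
    return predicted[idx], np.full(len(predicted), 1.0 / len(predicted))

def gaussian_cue(center, sigma):
    # Toy stand-in for a visual cue (e.g. color or edge response).
    return lambda x: np.exp(-0.5 * ((x - center) / sigma) ** 2) + 1e-12

# Toy usage: object at position 50, particles initialized off-target.
rng = np.random.default_rng(42)
parts = rng.normal(45.0, 5.0, size=500)
wts = np.full(500, 1.0 / 500)
cues = [gaussian_cue(50.0, 3.0), gaussian_cue(50.0, 6.0)]  # two hypothetical cues
for _ in range(5):
    parts, wts = sir_particle_filter_step(parts, wts, cam_motion=1.0,
                                          observe_cues=cues, rng=rng)
estimate = parts.mean()  # converges toward the cue peak at 50
```

Multiplying cue likelihoods makes a particle survive resampling only if every cue supports it, which is the intuition behind multi-cue integration: a reflection that fools one cue is down-weighted by the others.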

Original language: English
Pages (from-to): 208-218
Number of pages: 11
Journal: Journal of Visual Communication and Image Representation
Status: Published - 1 Jul 2015


Keywords

  • Camera motion
  • Co-inference
  • Layer separation
  • Motion compensation
  • Multi-cue integration
  • Particle filter
  • Reflection
  • Visual tracking


