Abstract




A. Borji, S. Frintrop, D. N. Sihite, L. Itti, Adaptive Object Tracking by Learning Background Context, In: Proc. IEEE CVPR 2012, Egocentric Vision workshop, Providence, Rhode Island, pp. 1-8, Jun 2012. (Cited by 169)

Abstract: One challenge when tracking objects is to adapt the object representation to the scene context in order to account for changes in illumination, coloring, scaling, etc. Here, we present a solution based on our earlier approach to object tracking using particle filters and component-based descriptors. We extend the approach to deal with changing backgrounds by means of a quick training phase with user interaction at the beginning of an image sequence. During this phase, several background clusters are learned, along with an object representation for each cluster. For the rest of the sequence, the best-fitting background cluster is determined for each frame and the corresponding object representation is used for tracking. Experiments show that a particle filter adapting to background changes can efficiently track objects and persons in natural scenes and achieves higher tracking accuracy than the basic approach. Additionally, by using the object tracker to follow the main character in video games, we were able to explain a larger share of eye fixations than other saliency models in terms of NSS score, showing that tracking is an important top-down attention component.
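The sketch below (Python/NumPy) illustrates the per-frame adaptation step described in the abstract: compute a background descriptor for the current frame, find the closest learned background cluster, and use the object representation associated with that cluster. All names (background_descriptor, select_object_model, particle_filter_step) are hypothetical, and the mean-color background descriptor is a placeholder assumption, not the paper's component-based descriptor.

    import numpy as np

    def background_descriptor(frame, bbox):
        """Placeholder background feature: mean color of pixels outside the object box."""
        mask = np.ones(frame.shape[:2], dtype=bool)
        x, y, w, h = bbox
        mask[y:y + h, x:x + w] = False
        return frame[mask].mean(axis=0)

    def select_object_model(frame, bbox, cluster_centers, object_models):
        """Pick the learned background cluster closest to the current frame's
        background descriptor and return the object model trained for it."""
        desc = background_descriptor(frame, bbox)
        dists = np.linalg.norm(cluster_centers - desc, axis=1)
        return object_models[int(np.argmin(dists))]

    # Usage inside a tracking loop (particle filter details omitted):
    #   model = select_object_model(frame, last_bbox, cluster_centers, object_models)
    #   particles = particle_filter_step(frame, particles, model)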

Themes: Model of Bottom-Up Saliency-Based Visual Attention, Model of Top-Down Attentional Modulation, Computational Modeling, Computer Vision

 
