L. Itti, N. Dhavale, F. Pighin, Photorealistic Attention-Based Gaze Animation, In: Proc. IEEE International Conference on Multimedia and Expo, pp. 1-4, Jul 2006. (Cited by 19)
Abstract: We apply a neurobiological model of visual attention and gaze control to the automatic animation of a photorealistic virtual human head. The attention model simulates biological visual processing along the occipito-parietal pathway of the primate brain. The gaze control model is derived from motion capture of human subjects, using a high-speed video-based eye and head tracking apparatus. Given an arbitrary video clip, the model predicts the visual locations most likely to attract an observer's attention, and simulates the dynamics of eye and head movements towards these locations. Tested on 85 video clips including synthetic stimuli, video games, TV news, sports, and outdoor scenes, the model demonstrates a strong ability to saccade towards and track salient targets. The resulting autonomous virtual human animation is of photorealistic quality.
Themes: Computational Modeling, Model of Bottom-Up Saliency-Based Visual Attention
Copyright © 2000-2007 by the University of Southern California, iLab and Prof. Laurent Itti.