L. Itti, N. Dhavale, F. Pighin, Realistic Avatar Eye and Head Animation Using a Neurobiological Model of Visual Attention, In: Proc. SPIE 48th Annual International Symposium on Optical Science and Technology, (B. Bosacchi, D. B. Fogel, J. C. Bezdek Ed.), Vol. 5200, pp. 64-78, Bellingham, WA:SPIE Press, Aug 2003. (Cited by 283)
Abstract: We describe a neurobiological model of visual attention and eye/head movements in primates, and its application to the automatic animation of a realistic virtual human head watching an unconstrained variety of visual inputs. The bottom-up (image-based) attention model is based on the known neurophysiology of visual processing along the occipito-parietal pathway of the primate brain, while the eye/head movement model is derived from recordings in freely behaving Rhesus monkeys. The system is successful at autonomously saccading towards and tracking salient targets in a variety of video clips, including synthetic stimuli, real outdoor scenes, and gaming console outputs. The resulting virtual human eye/head animation yields realistic rendering of the simulation results, both suggesting the applicability of this approach to avatar animation and reinforcing the plausibility of the neural model.
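The bottom-up attention model referred to here follows the Itti-Koch saliency architecture: feature maps are computed as center-surround differences across scales of an image pyramid and combined into a saliency map whose peak drives the next saccade. The sketch below is a minimal, hedged illustration of that idea for a single intensity channel only (the full model also uses color and orientation channels and a winner-take-all network); all function names and the choice of pyramid depth are illustrative, not the paper's implementation.

```python
import numpy as np

def downsample(img):
    # Halve resolution by 2x2 block averaging (one pyramid level).
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h, :w]
    return (img[0::2, 0::2] + img[1::2, 0::2] +
            img[0::2, 1::2] + img[1::2, 1::2]) / 4.0

def upsample(img, shape):
    # Nearest-neighbour resampling back to a reference shape.
    ys = (np.arange(shape[0]) * img.shape[0] // shape[0]).clip(0, img.shape[0] - 1)
    xs = (np.arange(shape[1]) * img.shape[1] // shape[1]).clip(0, img.shape[1] - 1)
    return img[np.ix_(ys, xs)]

def intensity_saliency(frame, levels=4):
    """Center-surround intensity conspicuity: |fine - coarse| across scales."""
    pyramid = [frame.astype(float)]
    for _ in range(levels):
        pyramid.append(downsample(pyramid[-1]))
    base = pyramid[1].shape  # accumulate maps at a mid resolution
    sal = np.zeros(base)
    for c in range(1, levels - 1):           # "center" scales
        for s in range(c + 1, levels + 1):   # coarser "surround" scales
            center = upsample(pyramid[c], base)
            surround = upsample(pyramid[s], base)
            sal += np.abs(center - surround)
    return sal / sal.max() if sal.max() > 0 else sal

# A bright blob on a dark background should be the most salient location,
# i.e. the target of the next simulated saccade.
frame = np.zeros((64, 64))
frame[28:36, 40:48] = 1.0
sal = intensity_saliency(frame)
y, x = np.unravel_index(np.argmax(sal), sal.shape)
```

In the paper's system the resulting gaze shift is then split between eye and head contributions according to the monkey-derived movement model; the saliency stage shown here only selects *where* to look, not *how* the movement unfolds.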
Themes: Model of Bottom-Up Saliency-Based Visual Attention, Computational Modeling, Computer Vision
Copyright © 2000-2007 by the University of Southern California, iLab and Prof. Laurent Itti.