Abstract




L. Itti, Neuromorphic Attentional Selection for Efficient Allocation of Computing Resources, In: Proc. Virtual Worlds and Simulation Conference, San Antonio, Texas, Jan 2002. (Cited by 17)

Abstract: When confronted with cluttered natural environments, animals still perform orders of magnitude better than artificial vision systems in tasks such as orienting, target detection, navigation and scene understanding. To better understand biological visual processing, we have developed a neuromorphic model of how our visual attention is attracted towards conspicuous locations in a visual scene. It replicates processing in the dorsal ("where") visual stream in the primate brain. The model includes a bottom-up (image-based) computation of low-level color, intensity, orientation and motion features, as well as a non-linear spatial competition that enhances salient locations in each feature channel. All feature channels feed into a unique scalar "saliency map" which controls where attention is focused next. We show how our simple within-feature competition for salience effectively suppresses strong but spatially widespread feature responses due to clutter. The model robustly detects salient targets in live outdoor video streams, despite large variations in illumination, clutter, and rapid egomotion. The success of this approach suggests that neuromorphic vision algorithms may prove unusually robust for outdoor vision applications. Further, we argue that the massively parallel attentional selection implemented in our model may represent an efficient approach to the general problem of allocating limited computational resources under conditions of sensory overload.
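The within-feature competition the abstract describes can be illustrated with a minimal NumPy sketch. This is not the paper's implementation; it is a simplified toy version, assuming a (max - mean)^2 weighting as a stand-in for the model's non-linear spatial competition, which penalizes maps with many widespread responses and promotes maps with a single conspicuous peak.

```python
import numpy as np

def promote_salient(feature_map):
    """Simplified within-feature competition (illustrative, not the
    paper's scheme): normalize the map to [0, 1], then weight it by
    (max - mean)^2 so a map with one isolated peak outweighs a map
    with strong but spatially widespread activity."""
    fmap = feature_map - feature_map.min()
    peak = fmap.max()
    if peak > 0:
        fmap = fmap / peak
    return fmap * (fmap.max() - fmap.mean()) ** 2

def saliency_map(feature_maps):
    """Sum per-feature conspicuity maps into one scalar saliency map."""
    return sum(promote_salient(m) for m in feature_maps)

# Toy example: a cluttered map vs. a map with one isolated peak.
rng = np.random.default_rng(0)
clutter = rng.uniform(0.6, 1.0, size=(8, 8))   # strong but widespread
single = np.zeros((8, 8))
single[2, 5] = 1.0                             # one conspicuous location
sal = saliency_map([clutter, single])
y, x = np.unravel_index(np.argmax(sal), sal.shape)
print(y, x)  # the most salient location is the isolated peak, (2, 5)
```

Because the cluttered map's mean is close to its max after normalization, its weight collapses toward zero, so the focus of attention lands on the isolated peak rather than anywhere in the clutter.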

Themes: Computational Modeling, Model of Bottom-Up Saliency-Based Visual Attention, Computer Vision


Copyright © 2000-2007 by the University of Southern California, iLab and Prof. Laurent Itti.
This page generated by bibTOhtml on Tue 09 Jan 2024 12:10:23 PM PST