Abstract

We present a vision-based navigation and localization system built on two biologically inspired scene-understanding models, each motivated by human visual capabilities:

(1) the Gist model, which captures the holistic characteristics and layout of an image, and
(2) the Saliency model, which emulates the visual attention of primates to identify conspicuous regions in the image.
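As a rough illustration of the two models, the sketch below is a minimal toy version under our own simplifying assumptions, not the published implementations: the gist descriptor pools intensity over a coarse grid, and the saliency sketch scores each pixel by its center-surround contrast. The actual models operate on multi-scale orientation, color, and intensity channels.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <utility>
#include <vector>

// Toy image type: row-major grayscale values in [0, 1].
using Image = std::vector<std::vector<double>>;

// Toy gist: pool the image into a grid x grid layout and return the mean
// intensity of each cell, giving a coarse "layout" signature of the scene.
std::vector<double> gistDescriptor(const Image& img, std::size_t grid) {
  const std::size_t h = img.size(), w = img[0].size();
  std::vector<double> desc(grid * grid, 0.0);
  std::vector<std::size_t> counts(grid * grid, 0);
  for (std::size_t y = 0; y < h; ++y)
    for (std::size_t x = 0; x < w; ++x) {
      const std::size_t cell = (y * grid / h) * grid + (x * grid / w);
      desc[cell] += img[y][x];
      ++counts[cell];
    }
  for (std::size_t i = 0; i < desc.size(); ++i) desc[i] /= counts[i];
  return desc;
}

// Toy saliency: score each pixel by its center-surround contrast (absolute
// difference from the local mean in a square window of radius r) and return
// the (row, column) of the most conspicuous pixel.
std::pair<std::size_t, std::size_t> mostSalient(const Image& img, int r) {
  const int h = static_cast<int>(img.size());
  const int w = static_cast<int>(img[0].size());
  double best = -1.0;
  std::pair<std::size_t, std::size_t> loc{0, 0};
  for (int y = 0; y < h; ++y)
    for (int x = 0; x < w; ++x) {
      double sum = 0.0;
      int n = 0;
      for (int dy = -r; dy <= r; ++dy)
        for (int dx = -r; dx <= r; ++dx) {
          const int yy = y + dy, xx = x + dx;
          if (yy >= 0 && yy < h && xx >= 0 && xx < w) {
            sum += img[yy][xx];
            ++n;
          }
        }
      const double score = std::abs(img[y][x] - sum / n);
      if (score > best) {
        best = score;
        loc = {static_cast<std::size_t>(y), static_cast<std::size_t>(x)};
      }
    }
  return loc;
}
```

A single bright pixel on a dark background, for instance, wins the center-surround contest, while a uniform image yields a flat gist vector.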

The localization system uses the gist features and salient regions to localize the robot accurately, while the navigation system uses the salient regions for visual feedback control, steering the robot toward a user-provided goal location.
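One simple form such visual feedback control can take, shown here as a hypothetical proportional law rather than the exact controller used on Beobot2.0, is to steer so that a matched salient region returns to the image column recorded during the teaching run:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Hypothetical proportional steering law (an assumption for illustration,
// not the published Beobot2.0 controller). xObserved and xExpected are pixel
// columns of a matched salient region; the command is clamped to
// [-maxCmd, maxCmd]. We assume the convention that a positive command
// steers right.
double steeringCommand(double xObserved, double xExpected,
                       double gain, double maxCmd = 1.0) {
  const double cmd = gain * (xObserved - xExpected);
  return std::clamp(cmd, -maxCmd, maxCmd);
}
```

When the region appears where it was taught, the command is zero; as the region drifts in the image, the command grows with the pixel error until it saturates at the clamp.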

We tested the system on our robot, Beobot2.0, on an indoor route of 36.67 m (10,890 video frames) and an outdoor route of 138.27 m (28,971 frames). On average, the robot drives within 3.68 cm (indoor) and 8.78 cm (outdoor) of the center of the lane.

Papers

Source Codes

The code is integrated into the iLab Neuromorphic Vision C++ Toolkit. To gain code access, please follow the download instructions there.

The code is in saliency/src/Robot/Beobot2/Navigation/GistSal_Navigation/GistSal_Navigation.C
To compile the code: make bin/app-GistSal_Navigation
To run the code, from the saliency folder, run the command:

./bin/app-GistSal_Navigation ../data/HNBk2/HNBbasement.env --ice-identity=N --out=display

Other options:

--out=video.mpg specifies the output prefix and sets the file type to video

--icestorm-ip=192.168.0.xx specifies the ICE server IP address


Introduction


Design and Implementation

The core of our present research is the use of gist and saliency features for navigation and localization.

Please see the iLab Neuromorphic Vision C++ Toolkit for all the source code.

Localization System

Navigation System

Salient Region Tracker

Forward Projection

Lateral Difference Estimation
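As a hedged sketch of what a lateral difference estimate could look like (our assumption for illustration, not the published method): approximate the robot's lateral offset from the pixel displacement of tracked salient regions relative to their stored teaching-run positions, scaled by an assumed calibration factor.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical lateral-difference estimate: pixel displacement of one
// tracked salient region from its database (teaching-run) column, scaled by
// an assumed calibration (meters of lateral offset per pixel). We assume the
// convention that a positive result means drift to the right of the path.
double lateralDifferenceMeters(double xTracked, double xDatabase,
                               double metersPerPixel) {
  return (xTracked - xDatabase) * metersPerPixel;
}

// Averaging the per-region displacements over all currently matched regions
// can reduce the effect of tracking noise in any single region.
double averageLateralDifference(const std::vector<double>& xTracked,
                                const std::vector<double>& xDatabase,
                                double metersPerPixel) {
  double sum = 0.0;
  for (std::size_t i = 0; i < xTracked.size(); ++i)
    sum += (xTracked[i] - xDatabase[i]);
  return (sum / xTracked.size()) * metersPerPixel;
}
```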



Testing

Indoor Testing (HNB): Total 4704 frames
320x240: mpg(16.8 MB) or RAW PPM Image(679.7 MB)
160x120: mpg(8.7 MB) or RAW PNM Image(213.8 MB)

For outdoor localization-only testing, we also provide the original dataset.

Results


Indoor : Hedco Neuroscience Building (HNB)

A video of a test run in the Hedco Neuroscience Building.

Outdoor : Engineering Quad (Equad)

Discussion

Conclusion


Copyright © 2010 by the University of Southern California, iLab and Prof. Laurent Itti