
The Beobot 2.0 project was launched to create an integrated and embodied Artificial Intelligence system. Our main contribution is that we provide all available information about the robot's hardware construction (the mechanical CAD SolidWorks files, the electronics PCB designs, and the list of components), as well as the robotics software, which can be freely downloaded from the [http://ilab.usc.edu/toolkit/ Vision Toolkit].

Our hope is that by making this information available, more research labs will start building fully functioning robots with a variety of capabilities.

On the hardware side, we created a human-sized, high-performance parallel-computing mobile robot platform built around miniature Computer-on-Modules (COMs). The COM Express module is a small (12.5×9.5cm) embedded system that is equivalent to a regular desktop computer; we use modules with a 2.2GHz Intel Core 2 Duo processor. The robot carries eight of them, for a total of 16 processing cores, which, to the best of our knowledge, makes it the most computationally powerful robot of its size. In addition, the estimated total cost of the robot is $24,923.29 (mechanical system: $4,646.69; electrical system: $20,276.60; shipping not included), far below what a robot of this capability sells for on the market. We estimate that, given the provided instructions and design files, assembly (with no re-designs) takes about two months.
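
To give a flavor of what this parallel hardware buys us, below is a minimal C++ sketch of splitting one camera frame across 16 cores using plain std::thread. Note that this only illustrates parallelism within a single COM module; across the eight modules, work is instead distributed over the robot's internal network. All names in the sketch are hypothetical and are not Vision Toolkit API.

<pre>
// Minimal sketch: tile a frame by rows and filter each tile on its own
// core. Hypothetical names; placeholder computation stands in for a
// real vision operation such as an edge filter.
#include <cstddef>
#include <thread>
#include <vector>

struct Frame { std::vector<float> pixels; std::size_t width, height; };

// Hypothetical per-tile operation applied to rows [rowBegin, rowEnd).
void filterRows(Frame& f, std::size_t rowBegin, std::size_t rowEnd) {
  for (std::size_t r = rowBegin; r < rowEnd; ++r)
    for (std::size_t c = 0; c < f.width; ++c)
      f.pixels[r * f.width + c] *= 0.5f;  // placeholder computation
}

void processFrameInParallel(Frame& f, unsigned nCores = 16) {
  std::vector<std::thread> workers;
  const std::size_t rowsPerCore = f.height / nCores;
  for (unsigned i = 0; i < nCores; ++i) {
    const std::size_t begin = i * rowsPerCore;
    const std::size_t end = (i + 1 == nCores) ? f.height
                                              : begin + rowsPerCore;
    workers.emplace_back(filterRows, std::ref(f), begin, end);
  }
  for (auto& w : workers) w.join();  // wait for every tile to finish
}
</pre>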

Given the available processing power, the software goal of the system is to create a general vision architecture in which the robot is able to localize itself and to recognize objects, people, and faces. We would like to emphasize that all of these capabilities are implemented within a unified architecture, so that the information produced by every module in previous frames is available as context to every module currently running. In addition, we plan to fit Beobot 2.0 with a Scorbot robot arm to enable it to perform object manipulation.
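
As an illustration of the unified-architecture idea, here is a hedged C++ sketch of a shared "blackboard" through which each module publishes its latest result and reads what the other modules produced in previous frames. The class and member names are ours for illustration only, not the actual Beobot 2.0 code.

<pre>
// Sketch of a shared context store: modules publish their most recent
// result under their name, and any module may read the others' results.
#include <map>
#include <memory>
#include <mutex>
#include <string>

struct ModuleResult {      // e.g. a pose estimate or an object label
  std::string summary;
  long frameNumber = 0;
};

class Blackboard {
 public:
  void publish(const std::string& module, ModuleResult r) {
    std::lock_guard<std::mutex> lock(mu_);
    latest_[module] = std::move(r);
  }
  // Returns the most recent result a module posted, or null if none.
  std::shared_ptr<const ModuleResult> latest(const std::string& module) const {
    std::lock_guard<std::mutex> lock(mu_);
    auto it = latest_.find(module);
    if (it == latest_.end()) return nullptr;
    return std::make_shared<const ModuleResult>(it->second);
  }
 private:
  mutable std::mutex mu_;
  std::map<std::string, ModuleResult> latest_;
};
</pre>

For example, a localization module could call blackboard.latest("ObjectRecognition") to bias its place estimate using landmarks recognized in earlier frames.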

This robot is an improvement over the previous version, [http://ilab.usc.edu/beobots/ Beobot 1], which had four 1GHz cores. Beobot 1 was also smaller, as it used a remote-control (RC) car as its platform.

People
1. [http://ilab.usc.edu/siagian/ Christian Siagian] Mechanical, electrical, and software.
2. [http://ilab.usc.edu/~kai/ Kai Chang] Mechanical, electrical, and software.
3. [http://www.linkedin.com/pub/1/4b0/a72 Randolph Voorhies] Mechanical, electrical, and software.
4. Dicky Nauli Sihite Mechanical, electrical, and software.
5. Manu Viswanathan Software.
6. [http://ilab.usc.edu/ Laurent Itti] Mechanical, electrical, and software.
User's Manual

The user's manual can be found here.

Design and Implementations

The system has two parts: software and hardware. The hardware part has two sub-systems: the mechanical and the electrical system. The integration issues between these two sub-systems usually pertain to:

* Making sure the size (length, width, and height) of the boards is accommodated.
* Specifying connectors and their placement so that they are easily reachable.
* Cable placement.

As for the software system, we discuss the unified vision architecture and its capabilities below.

Mechanical System

The mechanical system consists of the following components: locomotion, battery, cooling, and computer protection system.

This section discusses the design decisions, part manufacturing and assembly, as well as testing.

Electrical System

The electrical system consists of the following components: processors, power, and peripherals.

This section discusses the component selection, board design, connections, interfaces, and power management.

Software System

This section describes our mobile robotics software architecture, which focuses on problems such as localization, navigation, human-robot interaction, and object recognition. Our goal is for the robot to move autonomously about our college campus while recognizing people and objects, identifying whether a person needs help, and, hopefully, providing that help. The section also covers firmware-level issues such as low-level computer communication.
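
As an example of the firmware-level communication mentioned above, the following C++ sketch opens a serial link to a microcontroller and sends a motor-speed command. The device path, baud rate, and packet format here are assumptions for illustration, not the actual Beobot 2.0 protocol.

<pre>
// Sketch of low-level computer-to-microcontroller communication over a
// POSIX serial port. Device path, baud rate, and packet layout are
// assumed for illustration.
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <cstdint>

int openSerial(const char* device) {      // e.g. "/dev/ttyUSB0" (assumed)
  int fd = open(device, O_RDWR | O_NOCTTY);
  if (fd < 0) return -1;
  termios tio{};
  tcgetattr(fd, &tio);
  cfmakeraw(&tio);                        // raw 8N1, no line processing
  cfsetispeed(&tio, B115200);             // assumed baud rate
  cfsetospeed(&tio, B115200);
  tcsetattr(fd, TCSANOW, &tio);
  return fd;
}

// Hypothetical 4-byte packet: header, command id, payload, checksum.
bool sendMotorSpeed(int fd, int8_t speed) {
  uint8_t pkt[4] = {0xFF, 0x01, static_cast<uint8_t>(speed), 0};
  pkt[3] = static_cast<uint8_t>(pkt[0] ^ pkt[1] ^ pkt[2]);  // XOR checksum
  return write(fd, pkt, sizeof pkt) == sizeof pkt;
}
</pre>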

Related Robot Projects

* [http://www.willowgarage.com/pages/pr2/overview PR2] from Willow Garage (and various top universities): indoor personal robot; manipulation, HRI, navigation, task planning.
* [http://stair.stanford.edu/ STAIR Robot] from Stanford: object manipulation (door opening, picking up objects), object recognition, indoor navigation.
* [http://his.anthropomatik.kit.edu/english/241.php KIT humanoid] from Karlsruhe Institute of Technology: humanoid; manipulation, learning.
* [http://www.lsr.ei.tum.de/research/research-areas/robotics/ace-the-autonomous-city-explorer-project/ Autonomous City Explorer] project from TU München: outdoor navigation (not localization) and human-robot interaction.
* [http://roboticslab.uc3m.es/roboticslab/robot.php?id_robot=1 Maggie] from the Robotics Lab at University Carlos III of Madrid: social interaction, indoor navigation and localization, object recognition.
* [http://introlab.gel.usherbrooke.ca/mediawiki-introlab/index.php/Main_Page Introlab] from the University of Sherbrooke (Canada): interaction, mobility, etc., but unclear whether it is a fully consolidated system.
* [http://pr.cs.cornell.edu/videos.php Polar] from Cornell: housekeeping robot; object recognition, manipulation.
* [http://www.asl.ethz.ch/robots/crab Crab] from ETH: mechanical mobility, navigation.
* [http://www.roboticopenplatform.org/wiki/AMIGO AMIGO] from the Robotic Open Platform.

Autonomous wheelchairs, to help disabled people (mostly indoors):

* [http://autonomos.inf.fu-berlin.de/ AUTONOMOS]: wheelchair control using voice, eye tracking, and brain activity (EEG); a partner project with [http://userpage.fu-berlin.de/~latotzky/wheelchair/ Freie Universität Berlin].
