The Beobot 2.0 project was launched to create an integrated and embodied Artificial Intelligence system. Our main contribution is that we provide all available information about the robot's hardware construction (the mechanical CAD SolidWorks files, the electronics PCB designs, and a list of components) as well as the robot's software, which can be freely downloaded from the Vision Toolkit.

Our hope is that by making this information available, more research labs will start building fully functioning robots with various capabilities.

On the hardware side, we created a human-sized, high-performance parallel-computing mobile robot platform built around miniature Computer-on-Modules (COM). The COM Express module is a small (12.5 × 9.5 cm) form-factor embedded system that is equivalent to a regular desktop computer. We use modules with a 2.2GHz Intel Core 2 Duo processor; the robot carries 8 of them, for a total of 16 processing cores. To the best of our knowledge, this makes it the most computationally powerful robot of its size. It is also important to note that the estimated total cost of the robot is $24,923.29 (mechanical system: $4,646.69; electrical system: $20,276.60; shipping costs not included), far below what a robot of this capability sells for on the market. We estimate that, given all the provided instructions and design files, assembly (with no re-designs) takes about two months.
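As a quick sanity check, the subsystem costs and core counts stated above do add up; a trivial sketch using only the figures from the text:

```python
# Beobot 2.0 cost figures, taken directly from the text above (USD, no shipping).
mechanical_cost = 4646.69
electrical_cost = 20276.60
total_cost = mechanical_cost + electrical_cost
print(f"Estimated total cost: ${total_cost:,.2f}")  # → $24,923.29

# 8 COM Express modules, each carrying a dual-core 2.2GHz Core 2 Duo.
modules, cores_per_module = 8, 2
print(f"Total processing cores: {modules * cores_per_module}")  # → 16
```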

Given this available processing power, the software goal of our system is to create a general vision architecture in which the robot is able to localize itself and to recognize objects, people, and faces. Furthermore, we would like to emphasize that all of these capabilities are implemented in a unified architecture, so that the information obtained by all modules on previous frames is available as context to every module that is currently running. In addition, we plan to fit Beobot 2.0 with a Scorbot robot arm to enable it to perform object manipulation.
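The idea of making every module's previous-frame results available as shared context can be sketched with a simple blackboard. This is a minimal illustration, not the actual Beobot 2.0 code; the module names (Localizer, ObjectRecognizer) and the frame fields are hypothetical:

```python
# Sketch of a unified vision architecture: every module's results from previous
# frames are stored on a shared blackboard and handed to every module as context.

class Blackboard:
    """Shared store of per-frame results, visible to every module."""
    def __init__(self):
        self.history = []  # one dict per processed frame: {module_name: result}

    def context(self):
        """Results from the most recent frame (empty on the first frame)."""
        return self.history[-1] if self.history else {}

    def commit(self, frame_results):
        self.history.append(frame_results)

class Localizer:
    name = "localizer"
    def process(self, frame, context):
        # Hypothetical use of context: last frame's recognized objects
        # serve as landmarks for localization, when available.
        landmarks = context.get("recognizer", [])
        return {"pose": frame["odometry"], "landmarks_used": len(landmarks)}

class ObjectRecognizer:
    name = "recognizer"
    def process(self, frame, context):
        # A real recognizer could use the localizer's previous pose to narrow
        # its search; here we simply pass through the frame's detections.
        return frame["detections"]

def run_frame(board, modules, frame):
    """Run every module on one frame, giving each the previous frame's results."""
    ctx = board.context()
    results = {m.name: m.process(frame, ctx) for m in modules}
    board.commit(results)
    return results
```

For example, on the second frame the localizer sees the objects that the recognizer reported on the first frame, even though both modules run independently within each frame.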

This robot is an improvement over the previous version, Beobot 1, which had 4 processing cores of 1GHz each. Beobot 1 was also smaller in size, as it used a remote-control (RC) car as its platform.

People

Name Task
1. Christian Siagian Mechanical, electrical, and software
2. Chin-Kai Chang Mechanical, electrical, and software
3. Randolph Voorhies Mechanical, electrical, and software
4. Dicky Nauli Sihite Mechanical, electrical, and software
5. Manu Viswanathan Software
6. Laurent Itti Mechanical, electrical, and software

User's Manual

The user's manual can be found here.

Design and Implementations

The system has two parts: software and hardware. The hardware part has two sub-systems: the mechanical and the electrical system. The integration issues between these two sub-systems usually pertain to:

  • Making sure that the size (length, width, and height) of the boards is accommodated.
  • Specifying connectors and their placement so that they are easily reachable.
  • Cable placement.

As for the software system, we discuss the unified vision architecture and its capabilities below.

Mechanical System

The mechanical system consists of the following components: locomotion, battery, cooling, and computer protection system.

This section discusses the design decisions, part manufacturing and assembly, as well as testing.

Electrical System

The electrical system consists of the following components: processors, power, and peripherals.

This section discusses the component selection, board design, connections, interfaces, and power management.

Software System

This section describes our mobile robot architecture, which focuses on problems such as localization, navigation, human-robot interaction, and object recognition. Our goal is to have the robot autonomously move about our college campus while recognizing people and objects, identifying whether a person needs help, and, hopefully, providing that help. The section also covers firmware-level issues such as low-level computer communication.

For comparison, related integrated robot systems include:
  • PR2: Willow Garage (and various top universities): indoor personal robot: manipulation, HRI, navigation, task planning.
  • STAIR Robot from Stanford: object manipulation (door opening, picking up objects), object recognition, indoor navigation
  • KIT humanoid: from Karlsruhe Institute of Technology: humanoid, manipulation, learning.
  • Autonomous City Explorer project from TU München: outdoor navigation (not localization) and human-robot interaction.
  • Maggie: from Robotics Lab in University Carlos III of Madrid: social interaction, indoor navigation and localization, object recognition.
  • Introlab: from University of Sherbrooke (Canada): interaction, mobility, etc., though it is unclear whether it is a fully consolidated system.
  • Polar: from Cornell: housekeeping robot: object recognition, manipulation.
  • crab: from ETH: mechanical mobility, navigation
  • AUTONOMOS: wheelchair control using voice, eye tracking, and brain activity (EEG); in partnership with Freie Universität Berlin.
