{{youtube>4C4c7eg6WTY?960x540|Beobot 2.0}}

The Beobot 2.0 project was launched to create an integrated and embodied Artificial Intelligence system that can operate in large-scale, unconstrained environments (such as our college campus) and work alongside humans. Our main contributions are in both hardware and software. For the former, we provide all available information about the robot's hardware construction, both the mechanical Solidworks CAD files and the electronics PCB designs and component lists, which can be accessed in the [[Beobot_2.0/Mechanical_System |mechanical system]] and [[Beobot_2.0/Electrical_System| electrical system]] sections, respectively.

We created a human-sized, high-performance parallel-computing mobile robot platform built around miniature Computer on Modules (COMs). A COM Express module is a small embedded system with a 12.5x9.5cm form factor that is equivalent to a regular desktop computer; we use modules with a 2.2GHz Intel Core2 Duo processor. The robot carries eight of them, for a total of 16 processing cores. To the best of our knowledge, this makes it the most computationally powerful robot of its size. The estimated total cost of the robot is **$24923.29** (mechanical system: $4646.69, electrical system: $20276.60, not including shipping), which is far below what a robot of this capability sells for on the market. We estimate that, given the provided instructions and design files (with no re-designs), the **construction time is about 2 months**. This robot improves on the previous version, [[http://ilab.usc.edu/beobots/ | Beobot 1]], which has 4 cores of 1GHz each; Beobot 1 is also smaller, as it uses a remote-control (RC) car as its platform.

Using the available processing power, we aim to create a general scene-understanding system in which the robot can autonomously localize and navigate, recognize target objects and people, and even provide help whenever needed. To that end, we are developing a framework in which these individual capabilities are contextualized to enable a more robust real-time system. Our code can be freely downloaded from the [[http://ilab.usc.edu/toolkit/ | Vision Toolkit]]. Some videos of our past results and tests can be found in our **[[https://www.youtube.com/watch?v=zqIntIr9FFg&list=PLC5FD03FF39B34E4B | youtube video list]]**. We also use the same technology in a [[http://ilab.usc.edu/visualaid |visual aid device]] for the blind.

====People====
^ ^ Name ^ Task ^
|1. | [[http://ilab.usc.edu/siagian/ | Christian Siagian]] | Mechanical, electrical, and software |
|2. | [[http://ilab.usc.edu/~kai/| Chin-Kai Chang]] | Mechanical, electrical, and software |
|3. | [[http://www.linkedin.com/pub/1/4b0/a72 | Randolph Voorhies]] | Mechanical, electrical, and software |
|4. | Dicky Nauli Sihite | Mechanical, electrical, and software |
|5. | Manu Viswanathan | Software |
|6. | [[http://ilab.usc.edu/ | Laurent Itti]] | Mechanical, electrical, and software |

====User's Manual====
The user's manual will be posted [[Beobot_2.0/Users_Manual |here]] when it is ready for publishing.

====Design and Implementations====
The project has both hardware (mechanical and electrical) and software components:

===Mechanical System===
The [[Beobot_2.0/Mechanical_System |mechanical system]] consists of the locomotion, battery, cooling, and computer-protection systems. The section discusses the design decisions, part manufacturing and assembly, as well as testing.
===Electrical System===
The [[Beobot_2.0/Electrical_System| electrical system]] consists of the processors, power, and peripherals. The section discusses component selection, board design, connections, interfaces, and power management.

===Software===
The [[Beobot_2.0/System |software system]] describes our mobile robotic architecture, which focuses on problems such as localization, navigation, human-robot interaction, and object recognition. A sketch of one way such work might be distributed across the robot's compute nodes is given at the end of this page.

====Links (Related Robot Projects)====
  * [[http://www.willowgarage.com/pages/pr2/overview | PR2]]: from Willow Garage (and various top universities): indoor personal robot: manipulation, HRI, navigation, task planning.
  * [[http://stair.stanford.edu/ | STAIR Robot]]: from Stanford: object manipulation (door opening, picking up objects), object recognition, indoor navigation.
  * [[http://his.anthropomatik.kit.edu/english/241.php | KIT humanoid]]: from the Karlsruhe Institute of Technology: humanoid, manipulation, learning.
  * [[http://www.lsr.ei.tum.de/research/research-areas/robotics/ace-the-autonomous-city-explorer-project/ | Autonomous City Explorer]]: from TU München: outdoor navigation (not localization) and human-robot interaction.
  * [[http://roboticslab.uc3m.es/roboticslab/robot.php?id_robot=1 | Maggie]]: from the Robotics Lab at University Carlos III of Madrid: social interaction, indoor navigation and localization, object recognition.
  * [[http://introlab.gel.usherbrooke.ca/mediawiki-introlab/index.php/Main_Page | Introlab]]: from the University of Sherbrooke (Canada): interaction, mobility, etc., though it is unclear whether it is a fully consolidated system.
  * [[http://pr.cs.cornell.edu/videos.php | Polar]]: from Cornell: housekeeping robot: object recognition, manipulation.
  * [[http://www.asl.ethz.ch/robots/crab | Crab]]: from ETH: mechanical mobility, navigation.
  * [[http://www.roboticopenplatform.org/wiki/AMIGO | AMIGO]]: from the Robotic Open Platform.
  * Autonomous wheelchairs, to help disabled people (mostly indoors):
    * [[http://autonomos.inf.fu-berlin.de/ | AUTONOMOS]]: wheelchair control using voice, eye tracking, and brain activity (EEG), in partnership with [[http://userpage.fu-berlin.de/~latotzky/wheelchair/ | Freie Universität Berlin]].
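
====Example: Farming Out Work Across Compute Nodes====
To make the parallel-computing design above concrete, here is a minimal, hypothetical sketch of the farm-out pattern referenced in the Software section: a producer pushes camera frames into a shared queue, and one worker per compute node drains it. This is **not** the Vision Toolkit's actual API; the real system distributes work over the network across the eight COM Express modules, while this single-machine sketch only emulates that pattern with one thread standing in for each node. All identifiers (''Frame'', ''FrameQueue'', ''computeNode'') are illustrative.

<code cpp>
// Hypothetical sketch: one thread per "compute node" pulls frames from a
// shared queue, emulating how vision work could be farmed out across the
// robot's eight COM Express modules. Not part of the Vision Toolkit.
#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

struct Frame { int id; /* pixel data would live here */ };

class FrameQueue {
  std::queue<Frame> q;
  std::mutex m;
  std::condition_variable cv;
  bool closed = false;
public:
  void push(Frame f) {
    { std::lock_guard<std::mutex> lk(m); q.push(f); }
    cv.notify_one();
  }
  // Blocks until a frame arrives or the queue is closed and drained.
  bool pop(Frame& f) {
    std::unique_lock<std::mutex> lk(m);
    cv.wait(lk, [&]{ return !q.empty() || closed; });
    if (q.empty()) return false;   // closed and nothing left to do
    f = q.front(); q.pop();
    return true;
  }
  void close() {
    { std::lock_guard<std::mutex> lk(m); closed = true; }
    cv.notify_all();
  }
};

// Stand-in for one COM Express module running a vision task.
void computeNode(int nodeId, FrameQueue& in) {
  Frame f;
  while (in.pop(f))
    std::printf("node %d processed frame %d\n", nodeId, f.id);
}

int main() {
  const int kNodes = 8;            // Beobot 2.0 carries 8 COM modules
  FrameQueue frames;
  std::vector<std::thread> nodes;
  for (int i = 0; i < kNodes; ++i)
    nodes.emplace_back(computeNode, i, std::ref(frames));

  for (int id = 0; id < 32; ++id)  // pretend the camera delivers 32 frames
    frames.push(Frame{id});
  frames.close();                  // no more input; let the workers drain

  for (auto& t : nodes) t.join();
  return 0;
}
</code>

Built with, e.g., ''g++ -std=c++11 -pthread farmout.cpp'' (the filename is arbitrary), the output shows each frame being claimed by whichever node is free, which is the load-balancing behavior the multi-module design is meant to provide.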