J. Windau, L. Itti, Situation awareness via sensor-equipped eyeglasses, In: Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 5674-5679, Nov 2013. [2013 acceptance rate: 43%] (Cited by 9)
Abstract: New smartphone technologies are emerging that combine head-mounted displays (HMD) with standard functions such as receiving phone calls and emails and assisting with navigation. This opens new opportunities to explore cyber-robotics algorithms (robotics sensors coupled with the human motor plant). To make these devices more adaptive to environmental conditions, user behavior, and user preferences, it is important to allow the sensor-equipped devices to efficiently adapt and respond to user activities (e.g., disabling incoming phone calls in an elevator, activating video recording while driving a car). This paper hence presents a situation awareness system (SAS) for head-mounted smartphones. After collecting inertial sensor data (accelerometers, gyroscopes) and video data (camera), SAS performs activity classification in three steps. Step 1 transforms the inertial sensor data into a head-orientation-independent, stable, normalized coordinate system. Step 2 extracts critical features (statistical, physical, GIST). Step 3 classifies activities (Naive Bayes classifier), distinguishes between environments (Support Vector Machine), and finally combines both results (Hidden Markov Model) for further improvement. SAS has been implemented on a sensor-equipped eyeglasses prototype and achieved high accuracy (81.5%) when distinguishing among 20 real-world activities.
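The first two steps of the pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the gravity-based frame normalization and the purely statistical feature set are assumptions standing in for the paper's full normalization procedure and its statistical/physical/GIST features.

```python
import numpy as np

def normalize_frame(accel, gyro):
    """Step 1 (sketch): rotate inertial readings into a head-orientation-
    independent frame, using the mean accelerometer vector as a gravity
    estimate to define the vertical axis. Inputs are (N, 3) arrays."""
    g = accel.mean(axis=0)
    g = g / np.linalg.norm(g)                       # unit gravity direction
    # Pick a reference axis not parallel to gravity, then build an
    # orthonormal basis whose third axis is aligned with gravity.
    ref = np.array([1.0, 0.0, 0.0]) if abs(g[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    x = np.cross(ref, g)
    x = x / np.linalg.norm(x)
    y = np.cross(g, x)
    R = np.stack([x, y, g])                         # rows: new basis vectors
    return accel @ R.T, gyro @ R.T

def extract_features(accel, gyro):
    """Step 2 (sketch): per-axis statistical features (mean and standard
    deviation) concatenated into one feature vector."""
    feats = []
    for signal in (accel, gyro):
        feats.append(signal.mean(axis=0))
        feats.append(signal.std(axis=0))
    return np.concatenate(feats)
```

In Step 3 such feature vectors would feed the activity classifier (e.g., a Gaussian Naive Bayes model), with environment labels from an SVM on the GIST features and an HMM smoothing the combined sequence of decisions over time.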
Themes: Scene Understanding
Copyright © 2000-2007 by the University of Southern California, iLab and Prof. Laurent Itti.
This page generated by bibTOhtml on Fri Jan 26 09:25:23 PST 2018