
To meet the need for natural, user-friendly means of interacting with and reprogramming toy and humanoid robots, a growing body of robotics research investigates the integration of gesture recognition and natural speech processing. Unfortunately, efficient methods for speech and vision processing remain computationally expensive and thus cannot easily be exploited on cost- and size-limited platforms. Personal Digital Assistants (PDAs) are ideal low-cost platforms for providing simple speech- and vision-based communication with a robot.
This paper investigates the use of PDA interfaces to provide multi-modal means of interacting with humanoid robots. We present PDA applications in which the robot tracks and imitates the user's arm and head motions, and learns a simple vocabulary for labeling objects and actions by associating the user's verbal utterances with his or her gestures. The PDA applications are tested on two humanoid platforms: Robota, a mini doll-shaped robot used as an educational toy with children, and DB, a full-body humanoid robot with 30 degrees of freedom.
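The vocabulary-learning idea described above can be illustrated with a minimal sketch: accumulate co-occurrence counts between recognized words and gesture labels during teaching episodes, then resolve a word to its most frequently paired gesture. This is a hypothetical toy illustration of word-gesture association, not the method actually implemented on the PDA; all class and label names below are invented for the example.

```python
from collections import defaultdict

class WordGestureAssociator:
    """Toy associative memory: counts co-occurrences of spoken words
    and gesture labels, then maps a word to its most-associated gesture.
    (Illustrative sketch only; not the paper's actual algorithm.)"""

    def __init__(self):
        # counts[word][gesture] -> number of joint observations
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, word, gesture):
        # One teaching episode: the word was uttered while the gesture occurred.
        self.counts[word][gesture] += 1

    def lookup(self, word):
        # Return the gesture most often paired with the word, or None if unknown.
        pairs = self.counts.get(word)
        if not pairs:
            return None
        return max(pairs, key=pairs.get)

# Hypothetical usage: two teaching episodes pair "wave" with an arm gesture,
# one noisy episode pairs it with a head gesture; the majority pairing wins.
assoc = WordGestureAssociator()
assoc.observe("wave", "arm_raise")
assoc.observe("wave", "arm_raise")
assoc.observe("wave", "head_nod")
print(assoc.lookup("wave"))  # -> arm_raise
```

A frequency-count rule like this is the simplest way to make the association robust to occasional recognition errors in either modality, since a single mislabeled episode does not overturn the majority pairing.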


Last update: 25/08/06