Author: Eric Maël, Stefan Zadel, Rolf P. Würtz, Mike Pagel, Mark Becker, Efthimia Kefalea, Jan C. Vorbrüggen, Christoph von der Malsburg, Jochen Triesch
Year of publication: 1999
Subject:
Source: Autonomous Robots 6:203-221
ISSN: 0929-5593
DOI: 10.1023/a:1008839628783
Description: We have designed a research platform for a perceptually guided robot, which also serves as a demonstrator for a coming generation of service robots. In order to operate semi-autonomously, such robots require a capacity for learning about their environment and tasks, and they will have to interact directly with their human operators. They must therefore be equipped with skills in human-computer interaction, vision, and manipulation. GripSee is able to autonomously grasp and manipulate objects on a table in front of it. The choice of object, the grip to be used, and the desired final position are indicated by an operator using hand gestures. Grasping is performed in a manner similar to human behavior: the object is first fixated, then its form, size, orientation, and position are determined, a grip is planned, and finally the object is grasped, moved to a new position, and released. As a final example of useful autonomous behavior, we show how the calibration of the robot's image-to-world coordinate transform can be learned from experience, thus making detailed and unstable calibration of this important subsystem superfluous. The integration concepts developed at our institute have led to a flexible library of robot skills that can easily be recombined for a variety of useful behaviors.
Database: OpenAIRE
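
The abstract states that the image-to-world coordinate transform can be learned from experience rather than calibrated by hand, but the record does not reproduce the paper's procedure. The following is only a minimal illustrative sketch under stated assumptions: it supposes the robot accumulates pairs of image coordinates and the corresponding table-plane world coordinates (for example, from completed grasps) and fits an affine image-to-world map by least squares. The function names and the affine model are assumptions for illustration, not the authors' method.

```python
import numpy as np

def fit_image_to_world(image_pts: np.ndarray, world_pts: np.ndarray) -> np.ndarray:
    """Estimate a 2x3 affine transform A with world ~= A @ [u, v, 1]^T.

    image_pts: (N, 2) pixel coordinates observed by the camera.
    world_pts: (N, 2) matching table-plane coordinates (e.g., from grasps).
    Illustrative sketch only; not the method used in the paper.
    """
    ones = np.ones((image_pts.shape[0], 1))
    design = np.hstack([image_pts, ones])          # (N, 3) homogeneous image points
    # Solve the linear systems for the x and y world coordinates in one call.
    coeffs, *_ = np.linalg.lstsq(design, world_pts, rcond=None)
    return coeffs.T                                # (2, 3) affine matrix

def image_to_world(A: np.ndarray, uv: np.ndarray) -> np.ndarray:
    """Map a pixel coordinate to table-plane coordinates with the fitted transform."""
    return A @ np.append(uv, 1.0)

if __name__ == "__main__":
    # Hypothetical usage: refit as new grasp experiences accumulate.
    rng = np.random.default_rng(0)
    img = rng.uniform(0, 640, size=(20, 2))
    true_A = np.array([[0.002, 0.0, -0.6],
                       [0.0, 0.002, -0.4]])
    wld = (true_A @ np.hstack([img, np.ones((20, 1))]).T).T
    A = fit_image_to_world(img, wld + rng.normal(0, 1e-3, wld.shape))
    print(image_to_world(A, np.array([320.0, 240.0])))
```

A projective (homography) model or a recursive online estimator could replace the batch affine fit if the camera geometry or a continuously adapting system required it; the sketch above only illustrates how such a mapping can be recovered from accumulated experience.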