Abstract: |
We present an example-based planning framework to generate semantic grasps, i.e., stable grasps that are functionally suitable for specific object manipulation tasks. We propose to use partial object geometry, tactile contacts, and hand kinematic data as proxies to encode task-related constraints, which we call semantic constraints. We introduce a semantic affordance map, which relates local geometry to a set of predefined semantic grasps that are appropriate to different tasks. Using this map, the pose of a robot hand with respect to the object can be estimated so that the hand is adjusted to achieve the ideal approach direction required by a particular task. A grasp planner is then used to search along this approach direction and generate a set of final grasps that have appropriate stability, tactile contacts, and hand kinematics. We show experiments planning semantic grasps on everyday objects and applying these grasps with a physical robot.