Semantic grasping: planning task-specific stable robotic grasps
Published in: Autonomous Robots, 2014-10, Vol. 37 (3), pp. 301-316
Main Authors:
Format: Article
Language: English
Summary: We present an example-based planning framework to generate semantic grasps, stable grasps that are functionally suitable for specific object manipulation tasks. We propose to use partial object geometry, tactile contacts, and hand kinematic data as proxies to encode task-related constraints, which we call semantic constraints. We introduce a semantic affordance map, which relates local geometry to a set of predefined semantic grasps that are appropriate to different tasks. Using this map, the pose of a robot hand with respect to the object can be estimated so that the hand is adjusted to achieve the ideal approach direction required by a particular task. A grasp planner is then used to search along this approach direction and generate a set of final grasps which have appropriate stability, tactile contacts, and hand kinematics. We show experiments planning semantic grasps on everyday objects and applying these grasps with a physical robot.
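The pipeline summarized above can be sketched in code: a semantic affordance map relates a local-geometry descriptor to predefined task-specific grasps, and a planner then searches along the recovered approach direction for the most stable candidate. This is a minimal illustrative sketch under stated assumptions; all names, descriptors, and the stability score below are hypothetical placeholders, not the authors' implementation.

```python
# Hedged sketch of the semantic-grasp pipeline from the abstract.
# The geometry labels, tactile patterns, and stability scoring are
# illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class SemanticGrasp:
    task: str             # task the grasp supports, e.g. "pour"
    approach_dir: tuple   # ideal approach direction in the object frame
    tactile_pattern: list # expected tactile contact signature

# Semantic affordance map: local-geometry descriptor -> candidate grasps.
AFFORDANCE_MAP = {
    "cylindrical_side": [SemanticGrasp("pour", (1.0, 0.0, 0.0), [1, 1, 0])],
    "flat_top":         [SemanticGrasp("handoff", (0.0, 0.0, -1.0), [1, 0, 1])],
}

def plan_semantic_grasp(local_geometry, task,
                        stability_of=lambda offset: 1.0 - 0.1 * abs(offset)):
    """Look up grasps for the observed geometry, keep those matching the
    task, then search small offsets along the approach direction and
    return the grasp together with the most stable offset found."""
    candidates = [g for g in AFFORDANCE_MAP.get(local_geometry, [])
                  if g.task == task]
    if not candidates:
        return None  # no semantic grasp known for this geometry/task pair
    grasp = candidates[0]
    # Search along the approach direction for the best stability score.
    best_offset = max(range(-2, 3), key=stability_of)
    return grasp, best_offset
```

With the placeholder stability score, searching offsets in [-2, 2] simply prefers the unperturbed approach; a real planner would evaluate grasp quality, tactile contacts, and hand kinematics at each pose instead.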
ISSN: 0929-5593, 1573-7527
DOI: 10.1007/s10514-014-9391-2