Learning task constraints for robot grasping using graphical models
Author: | Danica Kragic, Kai Huebner, Ville Kyrki, Dan Song |
---|---|
Year of publication: | 2010 |
Subject: |
Constraint learning, Computer science, Test data generation, GRASP, Inference, Machine learning, Task (project management), Constraint (information theory), Task analysis, Robot, Artificial intelligence, Graphical model, Imitation |
Source: | IROS |
DOI: | 10.1109/iros.2010.5649406 |
Description: | This paper studies the learning of task constraints that allow grasps to be generated in a goal-directed manner. We show how an object representation and a grasp generated on it can be integrated with the task requirements. The scientific problems tackled are (i) the identification and modeling of such task constraints, and (ii) the integration between a semantically expressed goal of a task and quantitative constraint functions defined in the continuous object-action domains. We first define constraint functions given a set of object and action attributes, and then model the relationships between the object, action, and constraint features and the task using Bayesian networks. The probabilistic framework deals with uncertainty, combines a priori knowledge with observed data, and allows inference on target attributes given only partial observations. We present a system designed to structure the data generation and constraint learning processes that is applicable to new tasks, embodiments, and sensory data. The application of the task constraint model is demonstrated in a goal-directed imitation experiment. A minimal illustrative sketch of such a network follows the record below. |
Database: | OpenAIRE |
External link: |
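
The description outlines a Bayesian network relating task, object, action, and constraint features, with inference on target attributes from partial observations. The sketch below illustrates that general idea with a toy discrete network in plain Python; the network structure, variable names, states, and probability values are illustrative assumptions and are not taken from the paper.

```python
# Minimal illustrative sketch (not the authors' implementation): a tiny discrete
# Bayesian network relating a task T, an object feature O, an action/grasp
# feature A, and a constraint feature C, with exact inference by enumeration.
# All variable names, states, and probabilities below are hypothetical.

from itertools import product

# Assumed structure for illustration: T -> O, T -> A, (O, A) -> C.
STATES = {
    "T": ["hand-over", "pouring"],        # task
    "O": ["small", "large"],              # object attribute (size class)
    "A": ["top-grasp", "side-grasp"],     # action/grasp attribute
    "C": ["satisfied", "violated"],       # task constraint feature
}

# Conditional probability tables, indexed by (value, parent values).
P_T = {"hand-over": 0.5, "pouring": 0.5}
P_O_given_T = {
    ("small", "hand-over"): 0.6, ("large", "hand-over"): 0.4,
    ("small", "pouring"):   0.3, ("large", "pouring"):   0.7,
}
P_A_given_T = {
    ("top-grasp", "hand-over"): 0.7, ("side-grasp", "hand-over"): 0.3,
    ("top-grasp", "pouring"):   0.2, ("side-grasp", "pouring"):   0.8,
}
P_C_given_OA = {
    ("satisfied", "small", "top-grasp"):  0.9, ("violated", "small", "top-grasp"):  0.1,
    ("satisfied", "small", "side-grasp"): 0.6, ("violated", "small", "side-grasp"): 0.4,
    ("satisfied", "large", "top-grasp"):  0.4, ("violated", "large", "top-grasp"):  0.6,
    ("satisfied", "large", "side-grasp"): 0.8, ("violated", "large", "side-grasp"): 0.2,
}

def joint(t, o, a, c):
    """Joint probability P(T, O, A, C) under the assumed factorization."""
    return P_T[t] * P_O_given_T[(o, t)] * P_A_given_T[(a, t)] * P_C_given_OA[(c, o, a)]

def query(target, evidence):
    """Posterior over `target` given a (possibly partial) evidence dict."""
    posterior = {v: 0.0 for v in STATES[target]}
    for t, o, a, c in product(*(STATES[v] for v in "TOAC")):
        assignment = {"T": t, "O": o, "A": a, "C": c}
        if any(assignment[k] != v for k, v in evidence.items()):
            continue
        posterior[assignment[target]] += joint(t, o, a, c)
    z = sum(posterior.values())
    return {v: p / z for v, p in posterior.items()}

if __name__ == "__main__":
    # Infer the task from a partially observed scene: the object and constraint
    # features are observed, the grasp attribute is not.
    print(query("T", {"O": "large", "C": "satisfied"}))
```

Running the script prints a posterior over the task given the observed object and constraint features, loosely mirroring the goal-directed imitation setting in which the intended task is inferred from only partial observations of the scene.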