Automatic Acquisition of a Repertoire of Diverse Grasping Trajectories through Behavior Shaping and Novelty Search
Abstract
Grasping a particular object may require a dedicated grasping movement that may also be specific to the robot end-effector. No generic, autonomous method exists to generate these movements without making assumptions about the robot or the object. Learning methods could help to discover relevant grasping movements autonomously, but they face an important issue: grasping movements are so rare that an exploration-based learning method has little chance of ever observing an interesting movement, creating a bootstrap problem. We introduce an approach that generates diverse grasping movements in order to solve this problem. The movements are generated in simulation, for particular object positions. We test the approach on several simulated robots: Baxter, Pepper, and a Kuka Iiwa arm. Although we show that the generated movements also work on a real Baxter robot, the main aim is to use this method to build a large dataset for bootstrapping deep learning methods.
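The title names novelty search as the exploration mechanism behind the diverse repertoire. As a point of reference, the standard novelty score of a candidate is the mean distance from its behavior descriptor to the k nearest descriptors already in an archive. A minimal sketch, assuming Euclidean behavior descriptors; the function name and parameters are illustrative, not the paper's actual implementation:

```python
import numpy as np

def novelty(bd, archive, k=15):
    """Novelty of a behavior descriptor `bd`: mean Euclidean distance
    to its k nearest neighbors in `archive` (past descriptors).
    An empty archive makes any behavior maximally novel."""
    if len(archive) == 0:
        return float("inf")
    # Distances from bd to every archived descriptor.
    dists = np.linalg.norm(np.asarray(archive) - np.asarray(bd), axis=1)
    k = min(k, len(dists))
    # Mean over the k smallest distances: sparse regions score high.
    return float(np.sort(dists)[:k].mean())
```

Selecting for high novelty rather than task reward is what lets such a method keep exploring even while successful grasps are still vanishingly rare.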
Origin: Files produced by the author(s)