TY - JOUR
T1 - VisGraB: A Benchmark for Vision-Based Grasping
AU - Kootstra, Gert
AU - Popović, Mila
AU - Jørgensen, Jimmy Alison
AU - Kragic, Danica
AU - Petersen, Henrik Gordon
AU - Krüger, Norbert
PY - 2012/5/17
Y1 - 2012/5/17
N2 - We present a database and a software tool, VisGraB, for benchmarking methods for vision-based grasping of unknown objects without prior object knowledge. The benchmark combines a real-world and a simulated experimental setup. The database includes stereo images of real scenes containing several objects in different configurations. The user provides a method for grasp generation based on the real visual input. The grasps are then planned, executed, and evaluated by the provided grasp simulator, where several grasp-quality measures are used for evaluation. This setup has the advantage that a large number of grasps can be executed and evaluated while dealing with dynamics and with the noise and uncertainty present in real-world images. VisGraB enables a fair comparison among different grasping methods. Furthermore, the user does not need to deal with robot hardware and can focus on the vision methods instead. As a baseline, benchmark results of our grasp strategy are included.
AB - We present a database and a software tool, VisGraB, for benchmarking methods for vision-based grasping of unknown objects without prior object knowledge. The benchmark combines a real-world and a simulated experimental setup. The database includes stereo images of real scenes containing several objects in different configurations. The user provides a method for grasp generation based on the real visual input. The grasps are then planned, executed, and evaluated by the provided grasp simulator, where several grasp-quality measures are used for evaluation. This setup has the advantage that a large number of grasps can be executed and evaluated while dealing with dynamics and with the noise and uncertainty present in real-world images. VisGraB enables a fair comparison among different grasping methods. Furthermore, the user does not need to deal with robot hardware and can focus on the vision methods instead. As a baseline, benchmark results of our grasp strategy are included.
U2 - 10.2478/s13230-012-0020-5
DO - 10.2478/s13230-012-0020-5
M3 - Article
SN - 2080-9778
VL - 3
JO - Paladyn, Journal of Behavioral Robotics
JF - Paladyn, Journal of Behavioral Robotics
IS - 2
SP - 54
EP - 62
ER -