VisGraB: A Benchmark for Vision-Based Grasping

Gert Kootstra, Mila Popović, Jimmy Alison Jørgensen, Danica Kragic, Henrik Gordon Petersen, Norbert Krüger

Research output: Contribution to journal › Article › peer-review

Abstract

We present VisGraB, a database and software tool for benchmarking methods for vision-based grasping of unknown objects, i.e., without prior object knowledge. The benchmark combines a real-world and a simulated experimental setup. The database contains stereo images of real scenes with several objects in different configurations. The user provides a method that generates grasps from the real visual input; the grasps are then planned, executed, and evaluated in the provided grasp simulator, where several grasp-quality measures are used for evaluation. This setup has the advantage that a large number of grasps can be executed and evaluated while still dealing with the dynamics of grasping and with the noise and uncertainty present in real-world images. VisGraB thus enables a fair comparison of different grasping methods. Moreover, the user does not need to deal with robot hardware and can focus on the vision methods instead. Benchmark results of our own grasp strategy are included as a baseline.
Original language: English
Pages: 54-62
Journal: Paladyn, Journal of Behavioral Robotics
Volume: 3
Issue: 2
Publication status: Published - 17 May 2012
Externally published: Yes
