Sensor system red-currant pruning robot: report on performance of sensors and algorithms

Jochen Hemming, Manya Afonso, Chrysanthos Papadakis, Naftali Slob, Bas Boom

Research output: Book/Report › Report › Professional

Abstract

This study explores the development of a mobile sensor system for a red currant pruning robot. The project evaluated two sensing approaches: sensors on a base platform, including a LiDAR and cameras mounted on a moving trolley for full-row scanning, and an end-effector stereo camera on the robotic arm for real-time image processing. A physical twin of red currant plants was constructed for controlled indoor testing. Manual image annotation was enhanced using a Virtual Reality tool. Deep learning algorithms were employed for segmentation and classification. OneFormer3D was used for instance segmentation of individual plants and for recognising objects in the orchard; it achieved an object recognition rate of 85% and an average precision of 60% for plant instance segmentation. Mask R-CNN was used to detect and classify one-year-old and two-year-old branches, and the ResNet-based classification reached an accuracy of 83%. The study highlights the need for a balance between sensor cost, accuracy, and real-time processing.
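
The report itself does not include code. As an illustration of the branch-detection step described in the abstract, the sketch below shows how a Mask R-CNN with a three-class head (background, one-year-old branch, two-year-old branch) could be set up using torchvision. The class layout, confidence threshold, and image size are assumptions made for this example, not the authors' implementation.

# Illustrative sketch only: the report does not specify its implementation.
# Assumed class layout: background, 1-year-old branch, 2-year-old branch.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

NUM_CLASSES = 3  # background + 1-year-old branch + 2-year-old branch (assumed)

def build_branch_maskrcnn(num_classes: int = NUM_CLASSES):
    # Start from a COCO-pretrained Mask R-CNN and replace both prediction heads
    # so the model predicts the branch classes instead of the COCO categories.
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    in_features_mask = model.roi_heads.mask_predictor.conv5_mask.in_channels
    model.roi_heads.mask_predictor = MaskRCNNPredictor(in_features_mask, 256, num_classes)
    return model

if __name__ == "__main__":
    model = build_branch_maskrcnn().eval()
    # A dummy RGB tensor stands in for an end-effector camera frame.
    image = torch.rand(3, 480, 640)
    with torch.no_grad():
        prediction = model([image])[0]
    # Keep only detections above an (assumed) confidence threshold of 0.5.
    keep = prediction["scores"] > 0.5
    print("branch instances detected:", int(keep.sum()))
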
Original language: English
Place of publication: Wageningen
Publisher: Wageningen Plant Research
Number of pages: 32
DOIs
Publication status: Published - 2025

Publication series

Name: Rapport / Stichting Wageningen Research, Wageningen Plant Research, Businessunit Glastuinbouw
No.: WPR-1411
