Angle estimation between plant parts for grasp optimisation in harvest robots

Research output: Contribution to journal › Article › Academic › peer-review

1 Citation (Scopus)

Abstract

For many robotic harvesting applications, the position and angle between plant parts are required to optimally position the end-effector before attempting to approach, grasp and cut the product. A method for estimating the angle between plant parts, e.g. stem and fruit, is presented to support the optimisation of grasp pose for harvest robots. The hypothesis is that this angle in the horizontal plane can be accurately derived from colour images under unmodified greenhouse conditions. It was further hypothesised that the locations of a fruit and stem could be inferred in the image plane from sparse semantic segmentations. The paper focussed on four sub-tasks for a sweet-pepper harvesting robot. Each task was evaluated under three conditions: laboratory, simplified greenhouse and unmodified greenhouse. The requirements for each task were based on the end-effector design, which required a 25° positioning accuracy. In Task I, colour image segmentation for the classes background, fruit, and stem plus wire was performed, meeting the requirement of an intersection-over-union > 0.58. In Task II, the stem pose was estimated from the segmentations. In Task III, the centres of the fruit and stem were estimated from the output of the previous tasks. The centre estimations in Tasks II and III both met the requirement of 25-pixel accuracy on average. In Task IV, the centres were used to estimate the angle between the fruit and stem, meeting the accuracy requirement of 25° in 73% of the cases. The work improved harvest performance, increasing the success rate from a theoretical 14% to 52% in practice under unmodified conditions.
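The abstract's two quantitative checks can be made concrete with a short sketch: the per-class intersection-over-union threshold of Task I (> 0.58) and the fruit-stem angle with the 25° tolerance of Task IV. The Python below is a minimal, illustrative sketch assuming boolean pixel masks for the segmentations and (x, y) pixel coordinates for the centres; all function names and example values are hypothetical and do not reproduce the paper's actual implementation.

```python
import math
import numpy as np

def intersection_over_union(pred_mask: np.ndarray, true_mask: np.ndarray) -> float:
    """IoU between a predicted and a reference boolean pixel mask for one
    class (Task I required IoU > 0.58). Hypothetical helper, not from the paper."""
    pred, true = pred_mask.astype(bool), true_mask.astype(bool)
    union = np.logical_or(pred, true).sum()
    if union == 0:
        return 0.0
    return float(np.logical_and(pred, true).sum()) / float(union)

def fruit_stem_angle(fruit_centre, stem_centre) -> float:
    """Angle (degrees) of the fruit-to-stem direction in the image plane,
    from the (x, y) pixel centres of Tasks II-III. Note image convention:
    y grows downward, so the angle is measured clockwise from horizontal."""
    dx = fruit_centre[0] - stem_centre[0]
    dy = fruit_centre[1] - stem_centre[1]
    return math.degrees(math.atan2(dy, dx))

def meets_angle_requirement(estimated_deg, reference_deg, tol_deg=25.0) -> bool:
    """Task IV criterion: estimated angle within 25 degrees of the
    reference, accounting for wrap-around at 360 degrees."""
    err = abs(estimated_deg - reference_deg) % 360.0
    return min(err, 360.0 - err) <= tol_deg

# Illustrative values only: fruit centre 40 px right of, 10 px below the stem.
angle = fruit_stem_angle((420, 310), (380, 300))    # ~14 degrees
print(angle, meets_angle_requirement(angle, 30.0))  # error ~16 degrees -> True
```

The wrap-around in the tolerance check matters because an estimate of 350° and a reference of 10° differ by 20°, not 340°; a naive absolute difference would wrongly fail such cases.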

Language: English
Pages: 26-46
Number of pages: 21
Journal: Biosystems Engineering
Volume: 183
DOI: 10.1016/j.biosystemseng.2019.04.006
Publication status: Published - 1 Jul 2019

Keywords

  • Agriculture
  • Angle estimation
  • Computer vision
  • Robotics
  • Semantic segmentation

Cite this

@article{9533646a03e845ef8093216c9e382ad2,
title = "Angle estimation between plant parts for grasp optimisation in harvest robots",
abstract = "For many robotic harvesting applications, position and angle between plant parts is required to optimally position the end-effector before attempting to approach, grasp and cut the product. A method for estimating the angle between plant parts, e.g. stem and fruit, is presented to support the optimisation of grasp pose for harvest robots. The hypothesis is that from colour images, this angle in the horizontal plane can be accurately derived under unmodified greenhouse conditions. It was hypothesised that the location of a fruit and stem could be inferred in the image plane from sparse semantic segmentations. The paper focussed on 4 sub-tasks for a sweet-pepper harvesting robot. Each task was evaluated under 3 conditions: laboratory, simplified greenhouse and unmodified greenhouse. The requirements for each task were based on the end-effector design that required a 25° positioning accuracy. In Task I, colour image segmentation for classes back-ground, fruit and stem plus wire was performed, meeting the requirement of an intersection-over-union > 0.58. In Task II, the stem pose was estimated from the segmentations. In Task III, centres of the fruit and stem were estimated from the output of previous tasks. Both centre estimations In Tasks II and III met the requirement of 25 pixel accuracy on average. In Task IV, the centres were used to estimate the angle between the fruit and stem, meeting the accuracy requirement of 25° for 73{\%} of the cases. The work impacted on the harvest performance by increasing its success rate from 14{\%} theoretically to 52{\%} in practice under unmodified conditions.",
keywords = "Agriculture, Angle estimation, Computer vision, Robotics, Semantic segmentation",
author = "Ruud Barth and Jochen Hemming and {Van Henten}, {Eldert J.}",
year = "2019",
month = "7",
day = "1",
doi = "10.1016/j.biosystemseng.2019.04.006",
language = "English",
volume = "183",
pages = "26--46",
journal = "Biosystems Engineering",
issn = "1537-5110",
publisher = "Elsevier",

}

Angle estimation between plant parts for grasp optimisation in harvest robots. / Barth, Ruud; Hemming, Jochen; Van Henten, Eldert J.

In: Biosystems Engineering, Vol. 183, 01.07.2019, p. 26-46.

Research output: Contribution to journal › Article › Academic › peer-review

TY - JOUR

T1 - Angle estimation between plant parts for grasp optimisation in harvest robots

AU - Barth, Ruud

AU - Hemming, Jochen

AU - Van Henten, Eldert J.

PY - 2019/7/1

Y1 - 2019/7/1

N2 - For many robotic harvesting applications, the position and angle between plant parts are required to optimally position the end-effector before attempting to approach, grasp and cut the product. A method for estimating the angle between plant parts, e.g. stem and fruit, is presented to support the optimisation of grasp pose for harvest robots. The hypothesis is that this angle in the horizontal plane can be accurately derived from colour images under unmodified greenhouse conditions. It was further hypothesised that the locations of a fruit and stem could be inferred in the image plane from sparse semantic segmentations. The paper focussed on four sub-tasks for a sweet-pepper harvesting robot. Each task was evaluated under three conditions: laboratory, simplified greenhouse and unmodified greenhouse. The requirements for each task were based on the end-effector design, which required a 25° positioning accuracy. In Task I, colour image segmentation for the classes background, fruit, and stem plus wire was performed, meeting the requirement of an intersection-over-union > 0.58. In Task II, the stem pose was estimated from the segmentations. In Task III, the centres of the fruit and stem were estimated from the output of the previous tasks. The centre estimations in Tasks II and III both met the requirement of 25-pixel accuracy on average. In Task IV, the centres were used to estimate the angle between the fruit and stem, meeting the accuracy requirement of 25° in 73% of the cases. The work improved harvest performance, increasing the success rate from a theoretical 14% to 52% in practice under unmodified conditions.

AB - For many robotic harvesting applications, the position and angle between plant parts are required to optimally position the end-effector before attempting to approach, grasp and cut the product. A method for estimating the angle between plant parts, e.g. stem and fruit, is presented to support the optimisation of grasp pose for harvest robots. The hypothesis is that this angle in the horizontal plane can be accurately derived from colour images under unmodified greenhouse conditions. It was further hypothesised that the locations of a fruit and stem could be inferred in the image plane from sparse semantic segmentations. The paper focussed on four sub-tasks for a sweet-pepper harvesting robot. Each task was evaluated under three conditions: laboratory, simplified greenhouse and unmodified greenhouse. The requirements for each task were based on the end-effector design, which required a 25° positioning accuracy. In Task I, colour image segmentation for the classes background, fruit, and stem plus wire was performed, meeting the requirement of an intersection-over-union > 0.58. In Task II, the stem pose was estimated from the segmentations. In Task III, the centres of the fruit and stem were estimated from the output of the previous tasks. The centre estimations in Tasks II and III both met the requirement of 25-pixel accuracy on average. In Task IV, the centres were used to estimate the angle between the fruit and stem, meeting the accuracy requirement of 25° in 73% of the cases. The work improved harvest performance, increasing the success rate from a theoretical 14% to 52% in practice under unmodified conditions.

KW - Agriculture

KW - Angle estimation

KW - Computer vision

KW - Robotics

KW - Semantic segmentation

U2 - 10.1016/j.biosystemseng.2019.04.006

DO - 10.1016/j.biosystemseng.2019.04.006

M3 - Article

VL - 183

SP - 26

EP - 46

JO - Biosystems Engineering

T2 - Biosystems Engineering

JF - Biosystems Engineering

SN - 1537-5110

ER -