Automated collection of facial temperatures in dairy cows via improved UNet

Hang Shu, Kaiwen Wang, Leifeng Guo, Jérôme Bindelle, Wensheng Wang*

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

In cattle, facial temperatures captured by infrared thermography provide useful physiological information for researchers and local practitioners. Traditional temperature collection requires extensive manual operation in dedicated software. This paper therefore proposes a tool for automated temperature collection from cattle facial landmarks (i.e., eyes, muzzle, nostrils, ears, and horns). An improved UNet was designed by replacing the traditional convolutional layers in the decoder with Ghost modules and adding Efficient Channel Attention (ECA) modules. The improved model was trained on our open-source cattle infrared image dataset. The results show that the Ghost modules reduced computational complexity and the ECA modules further improved segmentation performance. The improved UNet outperformed comparable models on the testing set, with the highest mean Intersection over Union of 80.76% and a slightly slower but still practical inference speed of 32.7 frames per second. A further agreement analysis reveals small to negligible differences between the automatically obtained temperatures in the eye and ear areas and the ground truth. Collectively, this study demonstrates the capacity of the proposed method for automated facial temperature collection from cattle infrared images. Further modelling and correction with data collected under more complex conditions are required before the method can be integrated into on-farm monitoring of animal health and welfare.
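The Efficient Channel Attention mechanism named in the abstract recalibrates channel responses with a global average pool followed by a lightweight 1D convolution across channels and a sigmoid gate. The sketch below is a minimal, framework-free numpy illustration of that idea, not the authors' implementation: the `eca` function name, the placeholder (untrained) averaging kernel, and the fixed kernel size are all assumptions for illustration; in practice the 1D convolution weights are learned and the kernel size is chosen adaptively from the channel count.

```python
import numpy as np

def eca(feature_map, kernel_size=3):
    """Simplified Efficient Channel Attention over a (C, H, W) feature map.

    Assumption: uses a fixed averaging kernel as a stand-in for the
    learned 1D convolution weights of a real ECA module.
    """
    c, h, w = feature_map.shape
    # 1. Global average pooling: one scalar descriptor per channel.
    y = feature_map.mean(axis=(1, 2))                     # shape (c,)
    # 2. 1D convolution across channels captures local
    #    cross-channel interaction without dimensionality reduction.
    kernel = np.ones(kernel_size) / kernel_size           # placeholder weights
    pad = kernel_size // 2
    y_padded = np.pad(y, pad, mode="edge")
    conv = np.array([np.dot(y_padded[i:i + kernel_size], kernel)
                     for i in range(c)])                  # shape (c,)
    # 3. Sigmoid gate in (0, 1), broadcast back over spatial dims.
    gate = 1.0 / (1.0 + np.exp(-conv))
    return feature_map * gate[:, None, None]
```

Because the gate lies strictly between 0 and 1, the module rescales (never amplifies) non-negative activations, which is why it adds expressiveness at almost no computational cost relative to the Ghost-module decoder it sits beside.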
Original language: English
Article number: 108614
Journal: Computers and Electronics in Agriculture
Volume: 220
Publication status: Published - 1 May 2024

