Title AI-driven multimodal sensing for early detection of health disorders in dairy cows
Authors Paulauskaite-Taraseviciene, Agne ; Nakrosis, Arnas ; Zymantiene, Judita ; Jurenas, Vytautas ; Vezys, Joris ; Sederevicius, Antanas ; Gruzauskas, Romas ; Oberauskas, Vaidas ; Japertiene, Renata ; Bubulis, Algimantas ; Kizauskiene, Laura ; Silinskas, Ignas ; Zemaitis, Juozas ; Ostasevicius, Vytautas
DOI 10.3390/ani16030411
Is Part of Animals. Basel : MDPI, 2026, vol. 16, iss. 3, art. no. 411, p. 1-28. ISSN 2076-2615
Keywords [eng] dairy cow ; mastitis ; thermal imaging ; artificial intelligence ; computer vision ; early prediction
Abstract [eng] Digital technologies that continuously quantify animal behavior, physiology, and production offer significant potential for the early identification of health and welfare disorders in dairy cows. In this study, a multimodal artificial intelligence (AI) framework is proposed for real-time health monitoring of dairy cows through the integration of physiological, behavioral, production, and thermal imaging data. Predictions are generated at the cow-day level by aggregating multimodal measurements collected during daily milking events. The dataset comprised 88 lactating cows, with veterinarian-confirmed udder, leg, and hoof infections grouped under a single 'sick' label. To prevent information leakage, model evaluation was performed using a cow-level data split, ensuring that data from the same animal did not appear in both the training and testing sets. The system is designed to detect early deviations from normal health trajectories before overt clinical symptoms appear. All measurements, except those from the intra-ruminal bolus sensor, were obtained non-invasively on a commercial dairy farm equipped with automated milking and monitoring infrastructure. A key novelty of this work is the simultaneous integration of data from three independent sources: an automated milking system, a thermal imaging camera, and an intra-ruminal bolus sensor. A hybrid deep learning architecture is introduced that combines core components of established models, including U-Net, O-Net, and ResNet, to exploit their complementary strengths for analyzing dairy cow health states. The proposed multimodal approach achieved an overall accuracy of 91.62% and an AUC of 0.94, improving classification performance by up to 3% over single-modality models and demonstrating enhanced robustness and sensitivity to early-stage disease.
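The cow-level data split mentioned in the abstract can be sketched as below. This is a minimal illustration, not the authors' implementation: the `cow_level_split` helper, the record field names, and the cow/day counts are illustrative assumptions; only the principle (all cow-day records of a given animal land entirely in either the training or the testing set) comes from the abstract.

```python
import random

# One record per cow-day (88 cows, 30 days each); fields are placeholders
# standing in for the paper's aggregated multimodal measurements.
records = [{"cow_id": cow, "day": day} for cow in range(88) for day in range(30)]

def cow_level_split(records, test_frac=0.2, seed=42):
    """Split cow-day records so no cow appears in both train and test.

    A row-level random split would leak information, because highly
    correlated measurements from the same animal could end up on both
    sides of the split.
    """
    cows = sorted({r["cow_id"] for r in records})
    rng = random.Random(seed)
    rng.shuffle(cows)
    n_test = max(1, int(len(cows) * test_frac))
    test_cows = set(cows[:n_test])  # held-out animals, all their days
    train = [r for r in records if r["cow_id"] not in test_cows]
    test = [r for r in records if r["cow_id"] in test_cows]
    return train, test

train, test = cow_level_split(records)
# No animal contributes data to both sets.
assert {r["cow_id"] for r in train}.isdisjoint({r["cow_id"] for r in test})
```

In practice the same grouping behavior is available off the shelf, e.g. scikit-learn's `GroupShuffleSplit` with `groups=cow_ids`.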
Published Basel : MDPI
Type Journal article
Language English
Publication date 2026