School of Agriculture, Food and Ecosystem Sciences - Research Publications

Search Results

Now showing 1 - 3 of 3
  • Item
    Livestock Identification Using Deep Learning for Traceability
    Dac, HH ; Gonzalez Viejo, C ; Lipovetzky, N ; Tongson, E ; Dunshea, FR ; Fuentes, S (MDPI, 2022-11)
    Farm livestock identification and welfare assessment using non-invasive digital technology have gained interest in agriculture over the last decade, especially for accurate traceability. This study aimed to develop a face recognition system for dairy cows using advanced deep-learning models and computer vision techniques. The approach is non-invasive and potentially applicable to other farm animals of importance for identification and welfare assessment. The video analysis pipeline follows the structure of standard human face recognition systems and consists of four main steps: (i) face detection, (ii) face cropping, (iii) face encoding, and (iv) face lookup. Three deep learning (DL) models were used within the pipeline: (i) a face detector, (ii) a landmark predictor, and (iii) a face encoder. All DL models were fine-tuned through transfer learning on a dairy cow dataset collected from a robotic dairy farm located at the Dookie campus of The University of Melbourne, Australia. Results showed an overall identification accuracy of 84% across videos from 89 different dairy cows. The computer program developed may be deployed on edge devices; it was tested on an NVIDIA Jetson Nano board with a camera stream. Furthermore, it could be integrated into the welfare assessment tools previously developed by our research group. A minimal sketch of the identification pipeline is given after this listing.
  • Item
    Biometric Physiological Responses from Dairy Cows Measured by Visible Remote Sensing Are Good Predictors of Milk Productivity and Quality through Artificial Intelligence
    Fuentes, S ; Gonzalez Viejo, C ; Tongson, E ; Lipovetzky, N ; Dunshea, FR (MDPI, 2021-10)
    New and emerging technologies, especially those based on non-invasive video and thermal infrared cameras, can be readily tested on robotic milking facilities. This research presents non-invasive computer vision methods to estimate cows' heart rate, respiration rate, and abrupt movements from RGB camera footage, together with machine learning models to predict eye temperature, milk production, and milk quality. RGB and infrared thermal videos (IRTV) were acquired from cows at a robotic milking facility. Results from 102 different cows with replicates (n = 150) showed that an artificial neural network (ANN) model using only inputs from RGB cameras achieved high accuracy (R = 0.96), with no signs of overfitting, in predicting eye temperature (°C, using IRTV as ground truth), daily milk productivity (kg milk day⁻¹), cow milk productivity (kg milk cow⁻¹), milk fat (%), and milk protein (%). The ANN model was then deployed on an independent set of 132 cow samples obtained on different days, which also rendered high accuracy, similar to that of model development (R = 0.93). This model can be applied using affordable RGB camera systems to obtain all the proposed targets, including eye temperature, which can also be used to model animal welfare and biotic/abiotic stress. Furthermore, these models can be readily deployed in conventional dairy farms. A minimal sketch of the regression stage is given after this listing.
  • Item
    Assessment of Smoke Contamination in Grapevine Berries and Taint in Wines Due to Bushfires Using a Low-Cost E-Nose and an Artificial Intelligence Approach
    Fuentes, S ; Summerson, V ; Viejo, CG ; Tongson, E ; Lipovetzky, N ; Wilkinson, KL ; Szeto, C ; Unnithan, RR (MDPI AG, 2020-09-01)
    Bushfires are increasing in number and intensity due to climate change. A newly developed low-cost electronic nose (e-nose) was tested on wines made from grapevines exposed to smoke in field trials. E-nose readings were obtained from wines from five experimental treatments: (i) low-density smoke exposure (LS), (ii) high-density smoke exposure (HS), (iii) high-density smoke exposure with in-canopy misting (HSM), and two controls: (iv) control (C; no smoke treatment) and (v) control with in-canopy misting (CM; no smoke treatment). These e-nose readings were used as inputs for machine learning algorithms to obtain a classification model with seven neurons and the five treatments as targets (Model 1), which classified 300 samples with 97% accuracy. Models 2 to 4 used 10 neurons, with 20 glycoconjugates and 10 volatile phenols as targets, measured in berries one hour after smoke exposure (Model 2; R = 0.98; R² = 0.95; b = 0.97), in berries at harvest (Model 3; R = 0.99; R² = 0.97; b = 0.96), and in wines (Model 4; R = 0.99; R² = 0.98; b = 0.98). Model 5 was based on the intensity of 12 wine descriptors determined via a consumer sensory test (R = 0.98; R² = 0.96; b = 0.97). These models could be used by winemakers to assess near real-time smoke contamination levels and to implement amelioration strategies to minimize smoke taint in wines following bushfires. A minimal sketch of the treatment classification model is given after this listing.
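
The three sketches below illustrate, in Python, the modelling approaches summarised in the abstracts above. The first is a minimal sketch of the four-step identification pipeline in "Livestock Identification Using Deep Learning for Traceability". It uses the open-source face_recognition library (a dlib-based face detector, landmark predictor and face encoder) as a stand-in for the authors' fine-tuned models; the file names, gallery structure and distance tolerance are illustrative assumptions only.

```python
# Sketch of the pipeline: (i) detection, (ii) cropping, (iii) encoding, (iv) lookup.
# face_recognition (dlib) stands in for the fine-tuned cow-face models of the paper.
import face_recognition
import numpy as np

def enrol(gallery):
    """Encode one reference image per known cow: {cow_id: 128-d face encoding}."""
    known = {}
    for cow_id, image_path in gallery.items():
        image = face_recognition.load_image_file(image_path)
        locations = face_recognition.face_locations(image)        # (i) face detection
        if not locations:
            continue
        # (ii) cropping and (iii) encoding happen inside face_encodings()
        # once the detected location is supplied.
        known[cow_id] = face_recognition.face_encodings(
            image, known_face_locations=locations[:1])[0]
    return known

def identify(frame_path, known, tolerance=0.6):
    """(iv) face lookup: return the closest enrolled cow ID, or None."""
    frame = face_recognition.load_image_file(frame_path)
    locations = face_recognition.face_locations(frame)
    if not locations:
        return None
    query = face_recognition.face_encodings(
        frame, known_face_locations=locations[:1])[0]
    cow_ids = list(known)
    distances = face_recognition.face_distance([known[c] for c in cow_ids], query)
    best = int(np.argmin(distances))
    return cow_ids[best] if distances[best] <= tolerance else None

if __name__ == "__main__":
    # Hypothetical reference images and query frame.
    gallery = {"cow_001": "cow_001_ref.jpg", "cow_002": "cow_002_ref.jpg"}
    print(identify("frame_0001.jpg", enrol(gallery)))
```

In the study, the detector, landmark predictor and encoder were fine-tuned on cow faces via transfer learning; the off-the-shelf human-face models used above would need the same adaptation before they could be expected to work on dairy cows.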
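
The second sketch corresponds to "Biometric Physiological Responses from Dairy Cows Measured by Visible Remote Sensing", whose abstract describes an ANN mapping RGB-derived physiological estimates to eye temperature, milk productivity and milk composition. It uses scikit-learn's MLPRegressor on synthetic placeholder data; the hidden-layer size, the scaling step and all feature values are assumptions rather than the authors' configuration.

```python
# Sketch of the ANN regression stage: three RGB-derived inputs predicting five
# targets. scikit-learn and the synthetic data are assumptions for illustration.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 150  # number of replicates reported in the abstract

# Inputs estimated from RGB video: heart rate, respiration rate, movement index.
X = rng.normal(size=(n, 3))
# Targets: eye temperature (deg C), milk/day, milk/cow, milk fat (%), milk protein (%).
y = rng.normal(size=(n, 5))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = make_pipeline(
    StandardScaler(),
    # Hidden-layer size is an assumption, not the architecture used in the paper.
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0),
)
model.fit(X_train, y_train)
print("Held-out R^2:", model.score(X_test, y_test))
```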
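
The third sketch corresponds to Model 1 of the smoke-taint study: a classifier with seven neurons that assigns e-nose readings to the five smoke treatments. scikit-learn's MLPClassifier is an assumed substitute for the authors' original modelling tooling, and the number of e-nose sensor channels and the data themselves are placeholders.

```python
# Sketch of Model 1: a seven-neuron classifier assigning e-nose readings to the
# five treatments (LS, HS, HSM, C, CM). Sensor count and data are placeholders.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_samples, n_sensors = 300, 9   # 300 samples as in the abstract; 9 channels assumed
X = rng.normal(size=(n_samples, n_sensors))   # e-nose sensor outputs
y = rng.integers(0, 5, size=n_samples)        # treatment labels

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(7,), max_iter=5000, random_state=0),
)
print("Cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
```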