Manual observation and classification of animal behaviors is laborious, time-consuming, and limited in its ability to process large amounts of data. A computer vision-based system was developed that automatically recognizes sow behaviors (lying, sitting, standing, kneeling, feeding, drinking, and shifting) in a farrowing crate. The system consisted of a low-cost 3D camera that simultaneously acquires digital and depth images and a software program that detects and identifies the sow's behaviors. This paper describes the computational algorithm for the analysis of depth images and presents its performance in recognizing the sow's behaviors compared with manual recognition. Images were acquired at 6 s intervals on three days of a 21-day lactation period. Based on analysis of these images, the algorithm classified behaviors with the following accuracies: 99.9% for lying, 96.4% for sitting, 99.2% for standing, 78.1% for kneeling, 97.4% for feeding, 92.7% for drinking, and 63.9% for transitioning between behaviors. The lower accuracy for the transitioning category presumably stemmed from the insufficient image-acquisition frequency, which can be readily increased. The reported system thus provides an effective way to automatically process and classify the sow's behavioral images. This tool is conducive to investigating the behavioral responses and time budgets of lactating sows and their litters under different farrowing crate designs and management practices. Rights: Works produced by employees of the U.S. Government as part of their official duties are not copyrighted within the U.S. The content of this document is not copyrighted.
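The abstract does not give the algorithm's actual decision rules, but the core idea of posture classification from a single overhead depth image can be sketched as follows. The height thresholds, the foreground mask, and the percentile-based body-top estimate are all illustrative assumptions, not the paper's method.

```python
import numpy as np

# Hypothetical height thresholds (cm above the crate floor); the paper's
# actual decision rules and values are not stated in the abstract.
LYING_MAX_HEIGHT = 30.0
SITTING_MAX_HEIGHT = 55.0

def classify_posture(depth_map, floor_depth):
    """Classify a sow posture from one overhead depth image (a sketch).

    depth_map   -- 2D array of camera-to-surface distances (cm)
    floor_depth -- camera-to-floor distance (cm)
    """
    heights = floor_depth - depth_map        # height of each pixel above floor
    body = heights[heights > 5.0]            # crude foreground (sow) mask
    if body.size == 0:
        return "empty"                       # no animal visible
    top = np.percentile(body, 95)            # robust estimate of body-top height
    if top < LYING_MAX_HEIGHT:
        return "lying"
    if top < SITTING_MAX_HEIGHT:
        return "sitting"
    return "standing"
```

Feeding, drinking, and kneeling would additionally require the animal's position relative to the feeder and drinker, which this single-threshold sketch omits.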
Heat stress is one of the most important environmental stressors facing poultry production and welfare worldwide. The detrimental effects of heat stress on poultry range from reduced growth and egg production to impaired health. Animal vocalisations are associated with different animal responses and can be used as useful indicators of the state of animal welfare. It is already known that specific chicken vocalisations such as alarm, squawk, and gakel calls are correlated with stressful events and could therefore be used as stress indicators in poultry monitoring systems. In this study, we focused on developing a machine learning-based hen vocalisation detection method to assess hens' thermal comfort. For classification of the vocalisations, nine temporal and spectral features related to source-filter theory were chosen, and a support vector machine (SVM) based classifier was developed. The classification performance of the optimal SVM model was 95.1 ± 4.3% (sensitivity) and 97.6 ± 1.9% (precision). Based on the developed algorithm, the study showed a significant correlation between specific vocalisations (alarm and squawk calls) and a thermal comfort index (temperature-humidity index, THI) (alarm-THI, R = −0.414, P = 0.01; squawk-THI, R = 0.594, P = 0.01). This work represents the first step towards the further development of technology to monitor flock vocalisations, with the intent of providing producers an additional tool to help them actively manage the welfare of their flock.
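The SVM classification step can be sketched with scikit-learn. The synthetic 9-dimensional vectors below merely stand in for the paper's nine source-filter-related features (which are not listed in the abstract), and the RBF kernel and its parameters are assumptions, not the study's reported configuration.

```python
import numpy as np
from sklearn.svm import SVC

# Illustrative stand-in data: two well-separated clusters of synthetic
# 9-dimensional "feature vectors" for two call types.
rng = np.random.default_rng(0)
X_alarm = rng.normal(loc=0.0, scale=0.5, size=(50, 9))   # pretend alarm calls
X_squawk = rng.normal(loc=3.0, scale=0.5, size=(50, 9))  # pretend squawk calls
X = np.vstack([X_alarm, X_squawk])
y = np.array([0] * 50 + [1] * 50)  # 0 = alarm, 1 = squawk

# RBF-kernel SVM (a common default choice; the paper's kernel is unspecified).
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X, y)
```

In practice the features would be extracted from audio segments (e.g. pitch, duration, spectral measures), and the model evaluated with cross-validation to obtain sensitivity and precision figures like those reported.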
With the increasing scale of farms, it is ever more difficult for farmers to monitor their animals, so we focused on a sound-based technique for automated monitoring of laying hens. Sound analysis has become an important tool for studying the behaviour, health and welfare of animals in recent years. A surveillance system using the microphone arrays of Kinect sensors was developed to automatically monitor birds' abnormal vocalisations during the night. Using the time-difference-of-arrival (TDOA) principle of the sound source localisation (SSL) method, the Kinect sensors produced accurate direction estimates. The system had an accuracy of 74.7% in laboratory tests and 73.6% in small poultry group tests for recognising sounds from different areas. Additionally, flocks produced an average of 40 sounds per bird during feeding time in the small group tests. On average, each normal chicken produced more than 53 sounds during the daytime (noon to 6:00 p.m.) and fewer than one sound at night (11:00 p.m.–3:00 a.m.). This system can be used to detect anomalous poultry status at night by monitoring the number of vocalisations and their area distributions, which provides a practical and feasible method for the study of animal behaviour and welfare.
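The TDOA principle underlying this kind of localisation can be sketched with a basic cross-correlation estimator; the Kinect-array implementation in the paper may well differ (e.g. it could use GCC-PHAT or the device's built-in beamforming).

```python
import numpy as np

def estimate_tdoa(sig_a, sig_b, fs):
    """Estimate the time difference of arrival (seconds) of a sound at two
    microphones via plain cross-correlation: a positive value means the
    sound reached microphone B before microphone A.

    sig_a, sig_b -- 1D sample arrays from the two microphones
    fs           -- sampling rate (Hz)
    """
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)  # lag in samples
    return lag / fs
```

Given the microphone spacing d and the speed of sound c, the direction of arrival then follows from the far-field relation θ = arcsin(c·τ/d), which is how a TDOA estimate becomes a direction estimate.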