In this paper, the feasibility of extracting the proportion of pigs located in different areas of a pig pen using advanced image-analysis techniques is explored, and possible applications are discussed. For example, at high ambient temperatures pigs generally locate themselves in the wet dunging area to avoid heat stress, since wetting the body surface is the major path for dissipating heat by evaporation. The proportions of pigs in the dunging area and the resting area could therefore be used as an indicator of climate-control failure in the pig environment, as pigs are not supposed to rest in the dunging area. The computer-vision methodology uses a learning-based segmentation approach built on several features extracted from the image. The applied approach combines extended state-of-the-art features with a structured prediction framework based on a logistic regression solver with elastic net regularization. In addition, the method produces a probability per pixel rather than a hard decision, which overcomes some of the limitations of setups that use grey-scale information only. The pig pen is a difficult imaging environment because of challenging lighting conditions such as shadows, poor lighting, and poor contrast between pig and background. To test under practical conditions, a pen containing nine young pigs was filmed from a top-view perspective with an Axis M3006 camera at a resolution of 640 × 480 in three 10-min sessions under different lighting conditions. The results indicate that, in comparison with grey-scale methods, a learning-based method improves the ability to reliably identify the proportions of pigs in different areas of the pen. Pigs with changed behaviour (location) in the pen may indicate changed climate conditions, and changed individual behaviour may also indicate inferior health or acute illness.
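The core idea of the abstract above, elastic-net-regularized logistic regression that yields a probability per pixel rather than a hard decision, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the features, labels, and dunging-area mask here are synthetic stand-ins for the extended image features and annotations the authors describe.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical per-pixel feature vectors (e.g. intensity and texture
# responses); in the paper these come from annotated pen images.
n = 2000
X_pig = rng.normal(1.0, 0.5, size=(n, 4))   # pixels on a pig
X_bg = rng.normal(-1.0, 0.5, size=(n, 4))   # background pixels
X = np.vstack([X_pig, X_bg])
y = np.concatenate([np.ones(n), np.zeros(n)])

# Logistic regression with elastic net regularization, as named in
# the abstract (the 'saga' solver supports the elasticnet penalty).
clf = LogisticRegression(penalty="elasticnet", solver="saga",
                         l1_ratio=0.5, C=1.0, max_iter=2000)
clf.fit(X, y)

# A probability per pixel rather than a hard decision.
probs = clf.predict_proba(X)[:, 1]

# Hypothetical dunging-area mask over the same pixels (illustrative).
dunging_mask = np.zeros(2 * n, dtype=bool)
dunging_mask[: n // 2] = True

# Proportion of pig pixels that fall inside the dunging area.
pig_pixels = probs > 0.5
proportion_in_dunging = (pig_pixels & dunging_mask).sum() / pig_pixels.sum()
```

Thresholding the probabilities at 0.5 is only one option; keeping the soft per-pixel probabilities allows downstream smoothing before the area proportions are computed.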
Damaging behaviors, like feather pecking (FP), have large economic and welfare consequences in the commercial laying hen industry. Selective breeding can be used to obtain animals that are less likely to perform damaging behavior on their pen-mates. However, with the growing tendency to keep birds in large groups, identifying specific birds that are performing or receiving FP is difficult. With current developments in sensor technologies, it may now be possible to identify laying hens in large groups that show less FP behavior and select them for breeding. We propose using a combination of sensor technology and genomic methods to identify feather peckers and victims in groups. In this review, we will describe the use of “-omics” approaches to understand FP and give an overview of sensor technologies that can be used for animal monitoring, such as ultra-wideband, radio frequency identification, and computer vision. We will then discuss the identification of indicator traits from both sensor technologies and genomics approaches that can be used to select animals for breeding against damaging behavior.
To maintain dairy cattle health and welfare at adequate levels, the behaviours occurring between cows should be analysed. Such behavioural analysis depends on reliable and robust tracking of individuals if it is to be viable and applicable on-site. In this article, we introduce a novel method for continuous tracking and data-marker-based identification of individual cows using convolutional neural networks (CNNs). The methodology for data acquisition and the overall implementation of tracking and identification are described. The region of interest (ROI) for the recordings was limited to a waiting area, 6 × 18 meters in total, with free entrances to four automatic milking stations. During the study period, 252 Swedish Holstein cows had access to the waiting area at a conventional dairy barn with varying conditions and illumination. Three Axis M3006-V cameras mounted in the ceiling at a height of 3.6 meters provided a top-down view for the recordings. In total, 4 months of video data were collected, containing 500 million frames. To evaluate the system, two 1-h recordings were chosen, and the exit time and gate ID found by the tracker for each cow were compared with the exit times produced by the gates. Of the 26 tracks considered, 23 were tracked correctly. Given those 26 starting points, the tracker maintained the correct position for a total of 101.29 min, or 225 s on average per starting point/individual cow. Experiments indicate that a cow could be tracked for close to 4 min before failure cases emerged, and that cows could be tracked successfully for over 20 min in mildly crowded (<10 cows) scenes. The proposed system is a crucial stepping stone toward a fully automated tool for continuous monitoring of cows and their interactions with other individuals and the farm-building environment.
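Continuous tracking of the kind described above reduces, at each frame, to associating new detections with existing tracks. The sketch below shows a greedy nearest-neighbour association on detection centroids; it is a deliberately simplified stand-in for the paper's CNN-based pipeline (real trackers also use appearance cues, and the `max_dist` gate here is an assumed parameter, not the authors' value).

```python
import math

def associate(tracks, detections, max_dist=1.0):
    """Greedily assign each existing track (id -> last centroid) to its
    nearest unclaimed detection centroid, rejecting matches farther
    than max_dist. Returns a dict: track id -> detection index."""
    assignments = {}
    free = set(range(len(detections)))
    for tid, (tx, ty) in tracks.items():
        best, best_d = None, max_dist
        for j in free:
            dx, dy = detections[j]
            d = math.hypot(dx - tx, dy - ty)
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            assignments[tid] = best
            free.discard(best)
    return assignments

# Two tracks, two detections observed slightly displaced:
tracks = {0: (1.0, 1.0), 1: (5.0, 5.0)}
dets = [(5.2, 5.1), (0.9, 1.2)]
print(associate(tracks, dets))  # {0: 1, 1: 0}
```

The distance gate is what causes the failure cases the abstract quantifies: in crowded scenes (≥10 cows), centroids fall within the gate of multiple tracks and identities can swap, which is consistent with the reported drop in track duration.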
In the field of applied animal behaviour, video recordings of a scene of interest are often made and then evaluated by experts. This evaluation is based on various criteria (the number of animals present, the occurrence of certain interactions, proximity between animals, and so forth) and aims to filter out video sequences that contain irrelevant information. Such a task, however, requires a tremendous amount of time and resources, making a manual approach ineffective. To reduce the time experts spend watching uninteresting video, this study introduces an automated watchdog system that can discard some of the recorded video material based on user-defined criteria. In a pilot study on cows, a convolutional neural network detector was used to detect and count the cows in the scene, with distances and interactions between cows included as additional filtering criteria. This approach removed 38% (50% with the additional filter parameters) of the recordings while losing only 1% (4%) of the potentially interesting video frames.
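The watchdog logic described above, keep a frame only if user-defined criteria on detector output are met, can be sketched as a simple predicate over bounding boxes. The thresholds below are illustrative assumptions, not the study's parameters, and `keep_frame` is a hypothetical helper name.

```python
import math
from itertools import combinations

def keep_frame(cow_boxes, min_count=3, interaction_dist=50.0):
    """Watchdog criterion over detector output for one frame.
    cow_boxes: list of (x0, y0, x1, y1) bounding boxes from a detector.
    Keep the frame if enough cows are present, or if any two cows are
    close enough (centre distance) to plausibly be interacting."""
    if len(cow_boxes) >= min_count:
        return True
    centers = [((x0 + x1) / 2, (y0 + y1) / 2)
               for x0, y0, x1, y1 in cow_boxes]
    return any(math.hypot(ax - bx, ay - by) < interaction_dist
               for (ax, ay), (bx, by) in combinations(centers, 2))

# Two cows 20 px apart: kept via the interaction-distance criterion.
print(keep_frame([(0, 0, 10, 10), (20, 0, 30, 10)]))  # True
# Two cows 200 px apart, fewer than min_count: discarded.
print(keep_frame([(0, 0, 10, 10), (200, 0, 210, 10)]))  # False
```

Running this predicate over every frame and discarding the `False` ones is the mechanism by which a system of this kind can drop a large share of recordings while retaining nearly all frames matching the experts' criteria.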