Mechanical damage to sugar beet during harvesting affects the quality of the final products and the sugar yield. Damage is currently assessed on random samples by harvester operators, so the assessment depends on the operator's subjective judgement and experience, which is problematic given the complexity of harvester machines. The main aim of this study was therefore to determine whether a two-dimensional digital imaging system coupled with convolutional neural network (CNN) techniques could detect visible mechanical damage to sugar beet during harvesting on board a harvester machine. Several CNN-based detector models were developed, including You Only Look Once (YOLO) v4, the region-based fully convolutional network (R-FCN) and faster regions with convolutional neural network features (Faster R-CNN). Images of sugar beet acquired from a harvester during harvesting under different farming conditions were used to train and validate the proposed models. The experimental results showed that the YOLO v4 CSPDarknet53 model detected damage in sugar beet with better performance (recall, precision and F1-score of about 92, 94 and 93%, respectively) and at higher speed (around 29 frames per second) than the other developed CNNs. With such a CNN-based vision system, it was possible to detect sugar beet damage automatically within the sugar beet harvester machine.
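As a brief illustration of the evaluation metrics reported above (not part of the original study), the sketch below computes precision, recall and F1-score from detection counts. The counts are hypothetical and chosen only so that the output roughly matches the reported ~94% precision, ~92% recall and ~93% F1-score; the formulas themselves are standard.

```python
# Hedged sketch: precision, recall and F1-score for an object detector.
# The counts used in the example are made up for illustration only.

def detection_metrics(true_positives: int, false_positives: int, false_negatives: int):
    """Return (precision, recall, f1) for a set of detection counts."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical counts that roughly reproduce the reported values:
p, r, f1 = detection_metrics(true_positives=920, false_positives=60, false_negatives=80)
print(f"precision={p:.2f}, recall={r:.2f}, f1={f1:.2f}")  # ~0.94, 0.92, 0.93
```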
The SmartBeet project aimed to develop a sensor system capable of detecting beet damage occurring in the harvester's cleaning system. The sensor information is intended to support the design of driver assistance systems that safeguard low-damage beets, which are best suited for long-term storage. Long-term storage trials in climate containers revealed that root-tip breakage caused by turbine cleaning correlated sufficiently closely with sugar losses and can therefore serve as an overall damage indicator. In a systematic drop test, the largest tip breakage occurred for heavier beets (>700 g), beets hitting the ground root tip first, and beets dropped from 2.5 m. Field experiments were conducted with measuring bobs shaped like beets and equipped with accelerometers and surface pressure sensors. They showed that the type and form of impacts, in addition to impact intensity, affect damage severity. Moreover, the turbines exerted less impact than the lifter, sieve conveyor and auger conveyor. The results imply that the beet throughput through the cleaning section significantly affects the occurrence of damage. In addition, the structure-borne sound of the beet guiding grates of the turbines was recorded. Single beet damage events were identified from high-speed camera videos and synchronized with the associated sound frequency spectra. In future, time segments and synchronized Fast-Fourier-transformed frequency spectra will be used to derive specific trait variables in order to develop a machine-learning model.
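To illustrate the kind of analysis described for the structure-borne sound of the turbine grates, the sketch below cuts a short time segment around a video-identified impact and computes its FFT magnitude spectrum. This is only a sketch under stated assumptions: the sampling rate, window length and the synthetic signal are hypothetical and not values from the project.

```python
# Hedged sketch: FFT magnitude spectrum of one structure-borne-sound time segment
# around a damage event. Sample rate, window length and signal are assumed here.
import numpy as np

def impact_spectrum(signal: np.ndarray, sample_rate: float, event_time: float,
                    window_s: float = 0.05):
    """Cut a short window centred on the event time and return (freqs, magnitudes)."""
    half = int(window_s * sample_rate / 2)
    centre = int(event_time * sample_rate)
    segment = signal[max(0, centre - half):centre + half]
    segment = segment * np.hanning(len(segment))   # taper to reduce spectral leakage
    magnitudes = np.abs(np.fft.rfft(segment))      # one-sided magnitude spectrum
    freqs = np.fft.rfftfreq(len(segment), d=1.0 / sample_rate)
    return freqs, magnitudes

# Example with a synthetic 20 kHz recording and an event identified at t = 1.2 s:
fs = 20_000.0
t = np.arange(0, 3.0, 1.0 / fs)
sound = 0.01 * np.random.randn(t.size)             # stand-in for recorded grate sound
freqs, mag = impact_spectrum(sound, fs, event_time=1.2)
print(freqs.shape, mag.shape)
```

Features such as band energies or spectral peaks extracted from such synchronized segments could then serve as the trait variables mentioned for the planned machine-learning model.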