2022
DOI: 10.1016/j.compag.2021.106586

Real-time strawberry detection using deep neural networks on embedded system (rtsd-net): An edge AI application

Cited by 86 publications (39 citation statements)
References 10 publications
Citation statements:
“…Chen et al. [22] added the identification of strawberry flowers to the detection of ripe and immature strawberries and estimated yield based on the FAST-RCNN algorithm, whose average detection accuracy was 72%. Li et al. [23] used a combination of depth features and classifiers for strawberry appearance recognition, mainly applied to mechanized production lines; the method's accuracy reached 96.55%, but its uniform background made it unsuitable for detection in a natural growth environment. Zhang [24] designed a lightweight real-time strawberry recognition device based on the YOLOv4-tiny algorithm, mainly applied to ripe strawberry harvest detection.…”
Section: Introduction (mentioning)
confidence: 99%
“…Therefore, it is necessary to optimize and improve the existing network model to meet the detection needs of this research. Currently, object detection technology has been applied to strawberries [30][31][32], grapes [33], apple fruits [34,35], flowers [36], maize [25], and rice [37,38], and has achieved relatively good application results. There is scientific evidence for the accuracy of the proposed architecture in the studies by Li et al. [39], in which YOLO-JD achieved the best detection accuracy, with an average mAP of 96.63%.…”
Section: Related Work (mentioning)
confidence: 99%
“…(2017), has been utilised in flower detection studies, as flowers are most visible from above the canopy. A consideration when using CNNs for high‐throughput phenotyping is how to balance the trade‐off between accuracy (for example, using Faster R‐CNN type networks (Chen et al., 2019; Lin & Chen, 2018; Zhou et al., 2020)) and efficiency (such as the YOLO family of architectures (Fan et al., 2022; Kim et al., 2020; Zhang et al., 2022)) if real‐time performance is of relevance in the system. The automation of fruit and flower counts is beneficial to the selection process as the productivity of genotypes can be assessed, allowing breeders to immediately disregard those that do not meet the required level and thus increasing the efficiency of the selection process.…”
Section: Automation Of Morphological Traits Currently Used In Breeding (mentioning)
confidence: 99%
“…As fruit and flower counts are also vital for yield forecasting applications, the precise detection and counting of these components have been a research focus for high‐throughput methods, with around half of the literature pertaining to automation of phenotypic traits centring around this. In image‐based phenotyping, CNNs have been used to address fruit (Chen et al., 2019; Fan et al., 2022; Ilyas et al., 2021; Kerfs et al., 2017; Kim et al., 2020; Kirk et al., 2020; Lamb & Chuah, 2018; Yu et al., 2019; Zhang et al., 2022; Zhou et al., 2020) and flower (Heylen et al., 2021; Lin & Chen, 2018) detection in real‐world agricultural conditions, as they offer greater robustness to the varying environmental conditions experienced than traditional machine learning methods that use manually defined features. However, a challenge for detection in real‐world environments is occlusion, which can be minimised by selecting an appropriate viewpoint from which to collect data so as to maximise the prominence of the organ of interest in the image.…”
Section: Automation Of Morphological Traits Currently Used In Breeding (mentioning)
confidence: 99%