2020
DOI: 10.3389/fpls.2020.00617
Rapeseed Stand Count Estimation at Leaf Development Stages With UAV Imagery and Convolutional Neural Networks

Abstract: Rapeseed is an important oil crop in China. Timely estimation of rapeseed stand count at early growth stages provides useful information for precision fertilization, irrigation, and yield prediction. Based on the nature of rapeseed, the number of tillering leaves is strongly related to its growth stages. However, no field study has been reported on estimating rapeseed stand count by the number of leaves recognized with convolutional neural networks (CNNs) in unmanned aerial vehicle (UAV) imagery. The objective…


Cited by 26 publications (20 citation statements)
References 66 publications
“…Similarly, the validation was carried out by comparing the values predicted by the models with the actual values obtained from the phenotypic and molecular data. To compare the performance of the models in the prediction of hybrid performance, statistical criteria such as MAE, RMSE, and R² were used in this research [23]. To create and train the neural network, MATLAB 2018b was used as the programming environment.…”
Section: Models Performance Evaluation (mentioning; confidence: 99%)
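The statement above names MAE, RMSE, and R² as the criteria for comparing predicted against observed values. A minimal sketch of those three metrics in Python with NumPy (the cited work used MATLAB 2018b; this helper and its name are illustrative, not the authors' code):

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """Return (MAE, RMSE, R^2) for paired observed/predicted values."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    residuals = y_true - y_pred
    mae = np.mean(np.abs(residuals))            # mean absolute error
    rmse = np.sqrt(np.mean(residuals ** 2))     # root mean squared error
    ss_res = np.sum(residuals ** 2)             # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)  # total sum of squares
    r2 = 1.0 - ss_res / ss_tot                  # coefficient of determination
    return mae, rmse, r2
```

For example, predictions [2, 3, 4, 5] against observations [1, 2, 3, 4] give MAE = 1.0, RMSE = 1.0, and R² = 0.2, since every prediction is off by exactly one unit.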
“…UAVs equipped with RGB cameras (UAV-RGB) offer higher resolution and less sensitivity to weather, which makes it convenient to acquire and process large-scale field information. Recently, combined with deep learning, UAV-RGB systems have effectively extracted purple leaves (Zhang et al., 2020a), recognized frozen rapeseed (Li et al., 2022), and estimated rapeseed stand count (Zhang et al., 2020b). Many experts have also been drawn to the study of crop counting.…”
Section: Introduction (mentioning; confidence: 99%)
“…UAVs are among the most popular HTPPs used to collect and track various crop traits because of their ease of deployment, low cost, and nondestructive, noninvasive nature (Sankaran et al., 2015; Shi et al., 2016; Yang et al., 2017). Many phenotyping studies with UAV platforms have been published in rice (J. Wang et al., 2021; J. Wu et al., 2019), wheat (Holman et al., 2016; Li et al., 2019), maize (Buchaillot et al., 2019; Su et al., 2019), soybean (Borra‐Serrano et al., 2020; Trevisan et al., 2020), and other plants (Shafian et al., 2018; J. Zhang et al., 2020). Various phenotypes can be sensed at the canopy level by UAVs, including plant height, canopy cover, and spectral reflectance (Galli et al., 2020; Kamilaris & Prenafeta‐Boldú, 2018; Sankaran et al., 2015; Volpato et al., 2021).…”
Section: Introduction (mentioning; confidence: 99%)
“…Plant height and leaf rotation angle are derived from high‐resolution red‐green‐blue (RGB) images (Holman et al., 2016; Kawamura et al., 2020; Xu et al., 2021). In addition, high‐resolution RGB cameras can monitor plant growth in seedling, heading, and maturity stages (Das et al., 2019; Madec et al., 2019; J. Zhang et al., 2020). Hence, aerial images captured by UAVs could be a potential data source showing the details of proso millet panicles and tillers for panicle detection and heading percentage estimation in proso millet breeding.…”
Section: Introduction (mentioning; confidence: 99%)