2022
DOI: 10.3390/rs14246252

FlowerPhenoNet: Automated Flower Detection from Multi-View Image Sequences Using Deep Neural Networks for Temporal Plant Phenotyping Analysis

Abstract: A phenotype is the composite of an observable expression of a genome for traits in a given environment. The trajectories of phenotypes computed from an image sequence and the timing of important events in a plant’s life cycle can be viewed as temporal phenotypes, indicative of the plant’s growth pattern and vigor. In this paper, we introduce a novel method called FlowerPhenoNet, which uses deep neural networks for detecting flowers from multi-view image sequences for high-throughput temporal plant phenotyping analysis. …
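The abstract above does not specify the network architecture, so the following is a minimal sketch only: it uses a generic torchvision detector fine-tuned for a single "flower" class to illustrate the detection-and-counting step over one day's multi-view images. The checkpoint path, class count, directory layout, and score threshold are assumptions, not the authors' implementation.

```python
# Minimal sketch, NOT the FlowerPhenoNet implementation: count flower
# detections in one day's multi-view images with a generic torchvision
# detector. Checkpoint, class count, folder layout, and threshold are
# assumptions for illustration only.
from pathlib import Path

import torch
import torchvision
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

# Hypothetical fine-tuned detector with background + "flower" classes.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(num_classes=2)
model.load_state_dict(torch.load("flower_detector.pt"))  # assumed checkpoint
model.eval()

def count_flowers(image_paths, score_thresh=0.5):
    """Pool flower detections over all views of one imaging day.

    Naive pooling can double-count flowers visible from several views;
    the paper's multi-view handling is not described in this excerpt.
    """
    total = 0
    for path in image_paths:
        img = convert_image_dtype(read_image(str(path)), torch.float)
        with torch.no_grad():
            pred = model([img])[0]
        total += int((pred["scores"] >= score_thresh).sum())
    return total

# e.g. four side views plus the top view for one plant on one day
views = sorted(Path("plant_042/day_35").glob("*.png"))  # hypothetical layout
print(count_flowers(views))
```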

Cited by 6 publications (3 citation statements)
References: 28 publications
“…It has three watering stations with a balance that can add water to a target weight or specific volume and records the specific quantity of water added daily. The images of the greenhouse with plants placed on the automated conveyor belt, the watering station, and plants entering into the imaging cabinets are available in (Das Choudhury et al., 2018; Das Choudhury et al., 2022). The cameras installed in the four imaging chambers are (a) visible light - side view and top view (Prosilica GT6600 29 megapixel camera with a Gigabit Ethernet interface), (b) infrared - side view and top view (Pearleye p-030 LWIR), (c) fluorescent - side view and top view (Basler Scout scA1400-17gm/gc), and (d) hyperspectral - side view (Headwall Hyperspec Inspector x-vnir) and near-infrared - top view (Goldeye p-008 SWIR), respectively.…”
Section: Methods
confidence: 99%
“…Deep neural networks have been successfully employed in high throughput temporal plant phenotyping for a variety of applications (Bashyam et al., 2021; Zheng et al., 2021; Das Choudhury et al., 2022). The method in (Das Choudhury et al., 2022) performs automated flower detection from multi-view image sequences to determine a set of novel phenotypes, e.g., the emergence time of the first flower, the total number of flowers present in the plant at a given time, flower growth trajectory, and blooming trajectory. A graph theoretic approach has been used by (Bashyam et al., 2021) to detect and track individual leaves of a maize plant for automated growth stage monitoring.…”
Section: Introduction
confidence: 99%
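As a reading aid only, here is a minimal sketch of how the temporal phenotypes listed in the passage above could be derived once a detector has produced a per-day flower count for a plant. The day numbering and count series are hypothetical, and the blooming trajectory is omitted because it would require per-flower tracking across days, which this excerpt does not describe.

```python
# Minimal sketch, assuming a detector has already produced one flower
# count per imaging day for a single plant. All data below are hypothetical.
from typing import List, Optional, Sequence, Tuple

def first_flower_emergence(days: Sequence[int], counts: Sequence[int]) -> Optional[int]:
    """Imaging day (e.g., days after germination) when a flower first appears."""
    for day, count in zip(days, counts):
        if count > 0:
            return day
    return None

def flower_count_trajectory(days: Sequence[int], counts: Sequence[int]) -> List[Tuple[int, int]]:
    """(day, number of flowers present) pairs over the imaging period."""
    return list(zip(days, counts))

days = [30, 31, 32, 33, 34, 35]   # hypothetical imaging days
counts = [0, 0, 1, 2, 4, 4]       # hypothetical per-day detections
print(first_flower_emergence(days, counts))   # -> 32
print(flower_count_trajectory(days, counts))
```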
“…Although increasing interest exists in predicting yield through flowering time and bloom (Das Choudhury et al., 2022; Han et al., 2021; Zhang et al., 2020), floral traits are rarely analyzed in these digital‐based platforms because of difficulties in procuring high spatial–temporal resolution for flowers (McCabe & Tester, 2021; Mochida et al., 2020). Drone‐borne imaging systems assisted by machine learning are being developed to increase accurate flower detection in field crops aiming at predicting yield through flowering time and/or intensity, but it requires continuous empirical algorithmic development for each species or variety studied (e.g., specific flower color and shape, variable baseline noise) and has, to date, mainly been applied to populations of relatively low genetic variability (Pieruschka & Schurr, 2019).…”
Section: Source–sink Relationships and The Limitations Of Plant‐pheno...
confidence: 99%