2021
DOI: 10.1007/978-3-030-92659-5_38

Weakly Supervised Segmentation Pretraining for Plant Cover Prediction

Cited by 4 publications (2 citation statements). References 48 publications.
“…Data annotation by an expert with domain-specific knowledge is a tedious and expensive task. The DL community is exploring various strategies to break this dependency on a large quantity of annotated data to train DL models in a label-efficient manner, including approaches like active learning (Nagasubramanian et al., 2021), transfer learning (Jiang and Li, 2020), weakly supervised learning (Ghosal et al., 2019; Körschens et al., 2021) and the more recent advances in self-supervised learning (Jing and Tian, 2020; Marin Zapata et al., 2021; Nagasubramanian et al., 2022). Transfer learning has been widely utilized in plant phenomics applications for classification and segmentation tasks (Wang et al., 2019; Kattenborn et al., 2021).…”
Section: Introduction (mentioning)
confidence: 99%
“…Data annotation by an expert with domain-specific knowledge is a tedious and expensive task. The DL community is exploring various strategies to break this dependency on a large quantity of annotated data to train DL models in a label-efficient manner, including approaches like active learning [10], transfer learning [11], weakly supervised learning [12, 13] and the more recent advances in self-supervised learning [14, 15]. In this work, we focus on deploying self-supervised learning approaches to the problem of characterizing maize kernels that are imaged in a commercial high-throughput seed imaging system (Qsorter technologies [16]).…”
Section: Introduction (mentioning)
confidence: 99%