2021
DOI: 10.1109/access.2021.3075293

Hierarchical Classification of Very Small Objects: Application to the Detection of Arthropod Species

Acknowledgments: We would like to thank the following organisations for their support: ANR-16-CONV-0004 #DigitAg, as well as the CIRAD DPP COSAQ agronomical research programme (activities 2015-2021), funded by a grant from the European Community (ERDF) and the Conseil Régional de La Réunion.

Cited by 11 publications
(6 citation statements)
References 19 publications
“…However, those relationships are hierarchical in nature, and hierarchical classification has been researched in different application domains (Silla and Freitas, 2011;Salakhutdinov et al, 2013;Park and Kim, 2020) such as diatom images (Dimitrovski et al, 2012), disease detection (An et al, 2021) and protein families (Sandaruwan and Wannige, 2021). For animals such as arthropods (Tresson et al, 2021) and fish species (Gupta et al, 2022) hierarchical classification has been investigated with the object detector YOLOv3 (Redmon and Farhadi, 2018) designed to detect and classify species using a "flat" multi-class structure. Tresson et al (2021) used a two-step approach with a YOLOv3 object detector to detect and classify arthropods in five super-classes, using a separate model to classify cropped images at the species level.…”
Section: Background and Related Work
confidence: 99%
“…Drastic changes in arthropod population abundance and diversity have negative cascading effects on ecological stability and ecosystem resiliency (Borer et al, 2012; Kremen et al, 1993; Tscharntke et al, 2012). To expedite and improve the analysis of these trends, the ecological field is currently developing deep learning methods to better understand this potential threat of food web collapse (Arje, Melvad, et al, 2020; Helton et al, 2022; Schneider et al, 2022; Tresson et al, 2021; Wani & Maul, 2021).…”
Section: Introduction
confidence: 99%
“…Deep learning in the field of entomology continually makes strides to accomplish tasks that previously required human experts (Hansen et al, 2020;Xin et al, 2020). This is particularly true for classification and detection (Ramcharan et al, 2017;Tresson et al, 2021) where deep learning models have, in recent years, standardized around specific vision architectures (ResNet, DenseNet, Vision Transformer, etc.) (Dosovitskiy et al, 2020;Gao et al, 2018;Szegedy et al, 2017).…”
Section: Introduction
confidence: 99%
“…Drastic changes in arthropod population abundance and diversity have negative cascading effects on ecological stability and ecosystem resiliency [8,9,10]. To expedite and improve the analysis of these trends, the ecological field is currently developing deep learning methods to better understand this potential threat of food web collapse [11,12,13,14,15].…”
Section: Introduction
confidence: 99%