2020
DOI: 10.1093/aobpla/plaa052

What plant is that? Tests of automated image recognition apps for plant identification on plants from the British flora

Abstract: There has been a recent explosion in the development of image recognition technology and its application to automated plant identification, so it is timely to consider its potential for field botany. Nine free apps or websites for automated plant identification, suitable for use on mobile phones or tablet computers in the field, were tested on a disparate set of 38 images of plants or parts of plants chosen from the higher plant flora of Britain and Ireland. There were large differences in performance with the be…

Cited by 41 publications (54 citation statements)
References 19 publications
“…Results of the four referenced comparative studies show that Flora Incognita achieves state of the art identification accuracy. We attribute Flora Incognita's varying performance mainly to the studies' experimental protocol, for example, solely using single image observations (Jones, 2020; Shapovalov et al., 2020), unavailable or wrong geolocation preventing habitat analysis (Jones, 2020) and identifying non‐supported taxa (Jones, 2020; Schmidt & Steinecke, 2019). However, Jones (2020) still concludes that the Flora Incognita app is a very valuable tool even for botanists and ecologists during field studies.…”
Section: Results
confidence: 99%
“…Results of the four referenced comparative studies show that Flora Incognita achieves state of the art identification accuracy. We attribute Flora Incognita's varying performance mainly to the studies' experimental protocol, e.g., solely using single image observations (Jones, 2020; Shapovalov et al., 2020), unavailable or wrong geolocation preventing habitat analysis (Jones, 2020), and identifying non-supported taxa (Schmidt and Steinecke, 2019; Jones, 2020) … (Jones, 2020) to gain a repeatable experimental setup but thereby neglecting our multi-image and context analyses that have been demonstrated to substantially improve identification accuracy (Rzanny et al., 2019; Seeland and Mäder, 2021).…”
Section: Identification Accuracy
confidence: 99%
“…Automated species identification is one promising avenue that has been discussed since the potential of machine learning for ecological applications has become evident (Gaston and O'Neill 2004). With the advent of deep learning methods (Goodfellow et al 2016) automated species identification reaches levels of accuracy comparable to human experts (Bonnet et al 2018, Wäldchen and Mäder 2018a, Jones 2020, Villon et al 2020). Today, multiple smartphone apps leverage such algorithms and enable users to identify e.g.…”
Section: Introduction
confidence: 99%
“…Today, multiple smartphone apps leverage such algorithms and enable users to identify e.g. plants, insects or birds directly in the field (Kumar et al 2012, Affouard et al 2017, Van Horn et al 2018, Wäldchen and Mäder 2018a, Jones 2020). The voluntarily shared ancillary information on time and location could soon turn such mobile observations into an invaluable resource for different monitoring tasks (Bonnet et al 2020).…”
Section: Introduction
confidence: 99%