2022
DOI: 10.3390/rs14030592

Transformer Neural Network for Weed and Crop Classification of High Resolution UAV Images

Abstract: Monitoring crops and weeds is a major challenge in agriculture and food production today. Weeds compete directly with crops for moisture, nutrients, and sunlight. They therefore have a significant negative impact on crop yield if not sufficiently controlled. Weed detection and mapping is an essential step in weed control. Many existing research studies recognize the importance of remote sensing systems and machine learning algorithms in weed management. Deep learning approaches have shown good performance in m…

Cited by 133 publications (82 citation statements)
References 52 publications
“…In the transformer, assuming that there is a processed sequence, unlike the CNN that generates the attention map through convolution and pooling [63], the goal of self-attention is to capture the interaction amongst all n entities by encoding each entity in terms of the global contextual information [47,52]. The implementation of self-attention can be briefly described as a process of mapping queries (Q), keys (K), and values (V) to outputs [46], where queries, keys, and values are generated by projecting the input sequence (Equation (1)).…”
Section: Self-attention and Multi-head Attention
confidence: 99%
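
A minimal NumPy sketch of this Q/K/V mapping follows; it is illustrative only, and the function name, weight matrices, and dimensions are assumptions rather than code from the cited paper. Queries, keys, and values are all projections of the same input sequence, so each output row encodes global context over all n entities.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a sequence x of shape (n, d_model).

    Queries, keys, and values are projections of the same input sequence,
    so every output position attends to all n entities (global context).
    """
    q = x @ w_q                                      # (n, d_k) queries
    k = x @ w_k                                      # (n, d_k) keys
    v = x @ w_v                                      # (n, d_v) values
    scores = q @ k.T / np.sqrt(k.shape[-1])          # (n, n) attention scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ v                               # (n, d_v) context-aware outputs

# Toy usage: 4 tokens with 8-dimensional embeddings and random projections.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)        # (4, 8)
```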
“…The detailed calculation of the attention score can be expressed as $\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\left(QK^{\top}/\sqrt{d_k}\right)V$. Compared to CNN, self-attention increases the receptive field without increasing the computational cost associated with kernel sizes. In addition, self-attention is invariant to permutations and to changes in the number of input points; hence it can easily operate on irregular inputs, unlike standard convolution, which requires a grid structure [52,64]. In transformers, self-attention is extended to multi-head self-attention by computing h self-attention operations in parallel, where h is the number of heads.…”
Section: Self-attention and Multi-head Attention
confidence: 99%
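
Extending the sketch above, here is a similarly hedged illustration of multi-head self-attention (an assumption-laden sketch, not the cited paper's implementation): h independent heads run the same attention operation in parallel on lower-dimensional projections, and their outputs are concatenated and projected back to the model dimension.

```python
import numpy as np

def multi_head_self_attention(x, num_heads, rng):
    """Compute h self-attention operations in parallel and concatenate them.

    x: (n, d_model) input sequence; d_model must be divisible by num_heads.
    Weights are random here purely for illustration; in a real transformer
    they are learned parameters.
    """
    n, d_model = x.shape
    d_head = d_model // num_heads
    heads = []
    for _ in range(num_heads):
        w_q, w_k, w_v = (rng.normal(size=(d_model, d_head)) for _ in range(3))
        q, k, v = x @ w_q, x @ w_k, x @ w_v
        scores = q @ k.T / np.sqrt(d_head)              # (n, n) per-head scores
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
        heads.append(weights @ v)                       # (n, d_head)
    concat = np.concatenate(heads, axis=-1)             # (n, d_model)
    w_o = rng.normal(size=(d_model, d_model))           # output projection
    return concat @ w_o

# Toy usage: 2 heads over a 4-token, 8-dimensional sequence.
rng = np.random.default_rng(1)
x = rng.normal(size=(4, 8))
print(multi_head_self_attention(x, num_heads=2, rng=rng).shape)  # (4, 8)
```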
“…Reedha et al. [86] explained that controlling weeds is a main challenge in crop production and agriculture, essentially because weeds take up the same nutrients as the crop plants.…”
Section: F. Weed Control
confidence: 99%