2022
DOI: 10.1109/lgrs.2022.3172410
Hyperspectral Image Classification of Brain-Inspired Spiking Neural Network Based on Attention Mechanism

Cited by 9 publications (6 citation statements)
References 15 publications
“…involving recalibration of the spectral features. Numerous scholars have successfully integrated attention mechanisms into spectral similarity measurements, effectively aggregating similar spectra and optimizing the cognitive performance of convolutional neural networks [44], [45], [46], [47].…”
Section: A Prerequisite (mentioning)
confidence: 99%
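The spectral recalibration described in this statement can be sketched as a squeeze-and-excitation-style attention over the bands of a hyperspectral cube: pool each band to one descriptor, pass it through a small bottleneck, and reweight the bands with the resulting sigmoid gates. This is a generic illustration, not the cited papers' exact architecture; the weight shapes, the reduction ratio of 4, and the function name are assumptions.

```python
import numpy as np

def spectral_attention(cube, W1, W2):
    """Recalibrate spectral bands with squeeze-and-excitation-style
    attention (a hypothetical sketch, not the cited papers' design).
    cube: (H, W, B) hyperspectral patch; W1: (B, B//r); W2: (B//r, B)."""
    z = cube.mean(axis=(0, 1))            # squeeze: one descriptor per band
    h = np.maximum(z @ W1, 0.0)           # excitation: bottleneck FC + ReLU
    w = 1.0 / (1.0 + np.exp(-(h @ W2)))   # sigmoid gates in (0, 1), one per band
    return cube * w                       # reweight each spectral band

rng = np.random.default_rng(0)
cube = rng.random((8, 8, 16))             # toy 16-band patch
W1 = rng.standard_normal((16, 4))         # reduction ratio r = 4 (assumed)
W2 = rng.standard_normal((4, 16))
out = spectral_attention(cube, W1, W2)
```

Because the gates lie in (0, 1), the module can only attenuate bands relative to the input, which is what lets it suppress dissimilar spectra while preserving the informative ones.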
“…Rate coding encodes information in the firing frequency of the spike trains, where the firing frequency is proportional to the intensity of the input pixel values. Most current image-classification work with SNNs adopts this coding scheme [44,45,46]. Fig.…”
Section: Spiking Encoder Schemes (mentioning)
confidence: 99%
“…To further verify the framework's advantage in time steps, we compare the classification performance of the proposed model with SNN-SSEM [44]. The spiking time step is an important metric for an SNN model: the fewer time steps needed to reach high accuracy, the stronger the SNN's learning ability. The experimental results show that the proposed model can learn the general characteristics of the HSIs over 5 time steps.…”
Section: B Comparison With Spiking Neural Network (mentioning)
confidence: 99%