2009
DOI: 10.1016/j.patcog.2009.04.008
On the use of small training sets for neural network-based characterization of mixed pixels in remotely sensed hyperspectral images

Cited by 91 publications (48 citation statements)
References 32 publications
“…In [23], the authors designed a multi-layer perceptron neural network combined with a Hopfield neural network to deal with nonlinear mixtures. In [6], [24], the authors discussed methods for automatic selection and labeling of training samples. These methods require the networks to be trained using pixels with known abundances, and the quality of the training data may affect the performance notably.…”
Section: Introduction (mentioning)
confidence: 99%
“…Several techniques have been proposed for such purposes [4], but all of them are very expensive in computational terms. Although these techniques map nicely to high performance computing systems such as commodity clusters [9], these systems are difficult to adapt to on-board processing requirements introduced by applications, such as wildland fire tracking, biological threat detection, monitoring of oil spills, and other types of chemical contamination.…”
Section: Introduction (mentioning)
confidence: 99%
“…Therefore, the optimal numbers of hidden layers and neurons are determined experimentally by trial and error, as applied, for instance, in [28,29]. If the required performance is not reached during testing, the system proceeds with another experiment.…”
Section: Proposed System (mentioning)
confidence: 99%