2015
DOI: 10.17265/2159-581x/2015.01.004
A Review on Back-Propagation Neural Networks in the Application of Remote Sensing Image Classification

Abstract: ANNs (artificial neural networks) are used extensively in remote sensing image processing. BPNNs (back-propagation neural networks) have been shown to attain high classification accuracy. However, the achieved accuracies vary noticeably across different network designs and implementations. Hence, researchers usually need to conduct several experimental trials before they can finalize the network design. This is a time-consuming process which significantly reduces the effectiven…

Cited by 13 publications (8 citation statements)
References 11 publications
“…MLP is an ANN for learning patterns through backpropagation [39], [40], [41]. With layers of interconnected neurons and non-linear activation functions, MLP is versatile in approximating non-linear relationships (Baraldi et al., 2001; Suliman and Zhang, 2015). Hyperparameters contribute to model complexity, non-linear mapping, convergence speed, and preventing overfitting [42].…”
Section: A Machine Learningmentioning
confidence: 99%
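The statement above describes an MLP as stacked layers of neurons with non-linear activations. A minimal sketch of that idea, assuming NumPy and hypothetical layer sizes (4 inputs, 8 hidden units, 3 outputs) that are not taken from the reviewed paper:

```python
import numpy as np

# Illustrative sketch only: a forward pass through a two-layer MLP,
# showing how layered weights plus a non-linear activation (tanh here)
# let the network approximate non-linear input-output relationships.
rng = np.random.default_rng(0)

def mlp_forward(x, w1, b1, w2, b2):
    """Two-layer perceptron: non-linear hidden layer, linear output."""
    hidden = np.tanh(x @ w1 + b1)   # non-linear activation per hidden unit
    return hidden @ w2 + b2          # linear read-out layer

# Hypothetical sizes: 4 input features, 8 hidden units, 3 output classes.
w1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
w2, b2 = rng.normal(size=(8, 3)), np.zeros(3)
x = rng.normal(size=(5, 4))          # batch of 5 samples
print(mlp_forward(x, w1, b1, w2, b2).shape)  # (5, 3)
```

Without the `tanh`, the two layers would collapse into a single linear map, which is why the non-linearity is essential to the versatility the citation mentions.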
“…The learning process is made possible by the back propagation algorithm. Thus, back propagation utilises the error function to adjust the weight of each input to a neuron in order to gradually reduce the error, which is the difference between the actual ANN output and the desired output [97,98]. The activation function introduces non-linearity to the data.…”
Section: Artificial Neural Networkmentioning
confidence: 99%
“…The learning process is made possible by the back propagation algorithm. Thus, the back propagation utilises the error function to adjust the weight of each input to a neuron in order to gradually reduce the error, which is the difference between the actual ANN output and the desired output [97,98].…”
Section: Artificial Neural Networkmentioning
confidence: 99%
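The update rule described in these statements — weights adjusted via the error between actual and desired output — can be sketched for a single sigmoid neuron. This is a hedged illustration with toy data, not code from the reviewed paper; the learning rate and epoch count are arbitrary choices:

```python
import numpy as np

# Sketch of back-propagation's core step: each input weight is moved
# against the gradient of the squared error (actual output minus
# desired output), so the error shrinks gradually over epochs.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_neuron(x, target, lr=0.5, epochs=2000):
    w = np.zeros(x.shape[1])
    b = 0.0
    for _ in range(epochs):
        out = sigmoid(x @ w + b)        # actual neuron output
        err = out - target               # difference from desired output
        grad = err * out * (1.0 - out)   # chain rule through the sigmoid
        w -= lr * (x.T @ grad)           # adjust the weight of each input
        b -= lr * grad.sum()
    return w, b

# Toy AND-like data, purely illustrative.
x = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
target = np.array([0., 0., 0., 1.])
w, b = train_neuron(x, target)
mse = np.mean((sigmoid(x @ w + b) - target) ** 2)
print(round(mse, 4))
```

With all-zero weights the initial mean squared error is 0.25; after training it is markedly lower, which is exactly the "gradually reduce the error" behaviour the citation describes.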
“…The neural networks used in this chapter are built from sigmoidal neurons trained with backpropagation [17]. Other functioning details are given in [12,13].…”
Section: Artificial Neural Networkmentioning
confidence: 99%