2012
DOI: 10.1371/journal.pone.0040093

Integrating Local and Global Error Statistics for Multi-Scale RBF Network Training: An Assessment on Remote Sensing Data

Abstract: Background: This study discusses the theoretical underpinnings of a novel multi-scale radial basis function (MSRBF) neural network along with its application to classification and regression tasks in remote sensing. The novelty of the proposed MSRBF network lies in the integration of both local and global error statistics in the node selection process. Methodology and Principal Findings: The method was tested on a binary classification task, detection of impervious surfaces using a Landsat satellite image, and a …
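To make the node-selection idea concrete, the sketch below shows one plausible reading of it: greedy forward selection over candidate RBF nodes drawn from the training data at several widths (scales), scoring each candidate by a weighted blend of the global RMSE and the error local to the candidate's neighborhood. The candidate pool, the 0.5-activation neighborhood cutoff, and the blend weight `alpha` are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def rbf(X, center, width):
    """Gaussian RBF activation of one node for each row of X."""
    d2 = np.sum((X - center) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * width ** 2))

def greedy_msrbf_fit(X, y, widths=(0.5, 1.0, 2.0), n_nodes=10, alpha=0.5):
    """Greedy forward selection of multi-scale RBF nodes.

    Every training point is a candidate center at every candidate width.
    A candidate's score blends the global RMSE with the RMSE over points
    "near" the candidate (activation > 0.5); `alpha` is a hypothetical
    knob, not a parameter taken from the paper.
    """
    centers, scales = [], []
    residual = y.astype(float).copy()
    for _ in range(n_nodes):
        best = None
        for c in X:
            for w in widths:
                phi = rbf(X, c, w)
                # least-squares coefficient of this single node on the residual
                coef = phi @ residual / (phi @ phi + 1e-12)
                new_res = residual - coef * phi
                global_err = np.sqrt(np.mean(new_res ** 2))
                local_mask = phi > 0.5          # points near the candidate center
                local_err = np.sqrt(np.mean(new_res[local_mask] ** 2))
                score = alpha * global_err + (1.0 - alpha) * local_err
                if best is None or score < best[0]:
                    best = (score, c, w)
        _, c, w = best
        centers.append(c)
        scales.append(w)
        # refit all output weights jointly by linear least squares
        Phi = np.column_stack([rbf(X, ci, wi) for ci, wi in zip(centers, scales)])
        weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        residual = y - Phi @ weights
    return np.array(centers), np.array(scales), weights
```

The brute-force scan over every (center, width) pair is quadratic in the sample count; it is meant to show the scoring idea rather than an efficient implementation.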

Cited by 3 publications (3 citation statements) · References: 67 publications
“…The core of classifier construction is to select the correct kernel function. The Radial Basis Function (RBF) kernel can project nonlinearly separable data into a high-dimensional space and then fit linear functions there to solve nonlinear problems [16]. The calculation of the RBF kernel is shown in equation (6), where γ controls the kernel width: …”
Section: Methods
confidence: 99%
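Equation (6) itself is not reproduced in the snippet, but the standard Gaussian RBF kernel it refers to is K(x, z) = exp(−γ‖x − z‖²). A minimal NumPy sketch (the squared-distance expansion is a common implementation trick, not something the citing paper specifies):

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian RBF kernel matrix: K[i, j] = exp(-gamma * ||X[i] - Z[j]||^2)."""
    # squared Euclidean distances between every row of X and every row of Z
    d2 = (np.sum(X ** 2, axis=1)[:, None]
          + np.sum(Z ** 2, axis=1)[None, :]
          - 2.0 * X @ Z.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))  # clamp tiny negative round-off
```

Larger γ narrows the kernel, so each training point influences only its close neighbors; smaller γ smooths the decision surface.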
“…Second, we also compared the predictive results of the proposed method with those of four other commonly used classifiers, i.e., Naïve Bayes [20], Logistic Function [21], RBFNetwork [22], and Random Forest [23], as implemented in WEKA [24]. The jackknife test results for identifying m6A sites in the benchmark dataset with the different classifiers are listed in Table 3.…”
Section: Results
confidence: 99%
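The jackknife test mentioned here is leave-one-out cross-validation: each sample is predicted by a model trained on all the remaining samples. A hedged sketch using scikit-learn stand-ins for three of the WEKA classifiers; the benchmark dataset and WEKA's RBFNetwork have no direct scikit-learn equivalents, so a synthetic toy dataset is substituted:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.naive_bayes import GaussianNB

# Toy stand-in for the benchmark dataset; the cited study used WEKA.
X, y = make_classification(n_samples=100, n_features=20, random_state=0)

classifiers = {
    "Naive Bayes": GaussianNB(),
    "Logistic": LogisticRegression(max_iter=1000),
    "Random Forest": RandomForestClassifier(random_state=0),
}

# Jackknife test = leave-one-out CV: each sample is scored by a model
# trained on all the others; the mean of the 0/1 fold scores is accuracy.
loo = LeaveOneOut()
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=loo).mean()
    print(f"{name}: {acc:.3f}")
```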
“…When used for classification problems, there are three important factors for evaluating network performance: 1) classification accuracy, 2) network size, and 3) training time. To achieve good network performance, different optimization algorithms are used to train the RBF hidden layer, such as K-means clustering [1, 2], fuzzy C-means clustering [3, 4], fuzzy K-nearest neighbors [5], differential evolution [6, 7], and other optimization algorithms [8–12]. However, in most of these methods, the number of RBF hidden nodes is assigned a priori, which may lead to poor adaptability for different sample sets.…”
Section: Introduction
confidence: 99%
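The K-means variant listed above follows a classic two-stage recipe: cluster the inputs to place the hidden-layer centers, pick a width heuristically, then solve the output weights in closed form by least squares. A minimal sketch; the fixed `n_hidden` argument is exactly the a-priori node count the quoted passage criticizes.

```python
import numpy as np
from sklearn.cluster import KMeans

def train_rbf_kmeans(X, y, n_hidden=10, seed=0):
    """Two-stage RBF training: K-means places the hidden-layer centers,
    then the output weights are solved by linear least squares.

    Note: n_hidden is fixed a priori, which is the limitation the quoted
    passage points out.
    """
    km = KMeans(n_clusters=n_hidden, n_init=10, random_state=seed).fit(X)
    centers = km.cluster_centers_
    # common width heuristic: max center-to-center distance / sqrt(2k)
    dists = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    sigma = dists.max() / np.sqrt(2.0 * n_hidden)
    # hidden-layer design matrix of Gaussian activations
    Phi = np.exp(
        -np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=-1) ** 2
        / (2.0 * sigma ** 2)
    )
    weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return centers, sigma, weights
```

Because `n_hidden` must be chosen before seeing how well the network fits, adaptive node-selection schemes such as the MSRBF network above instead grow the hidden layer from error statistics.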