2015
DOI: 10.1109/tfuzz.2014.2315656

Interval Type-2 Radial Basis Function Neural Network: A Modeling Framework

Abstract: In this paper, an interval type-2 radial basis function neural network (IT2-RBF-NN) is proposed as a new modeling framework. We take advantage of the functional equivalence of radial basis function neural networks (RBF-NNs) to a class of type-1 fuzzy logic systems (T1-FLS) to propose a new interval type-2 equivalent system; it is systematically shown that the type equivalence (between RBF and FLS) of the new modeling structure is maintained in the case of the IT2 system. The new IT2-RBF-NN incorporates interval…
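A rough, hypothetical sketch (in Python/NumPy) of the functional equivalence the abstract refers to: each hidden RBF unit acts as an IT2 fuzzy rule whose Gaussian firing strength is bounded by lower and upper membership functions, here modelled by an uncertain width (sigma_lo, sigma_hi), and the averaged normalised outputs stand in for a full Karnik-Mendel type reduction. All names and the chosen footprint of uncertainty are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def it2_rbf_output(x, centers, sigma_lo, sigma_hi, w):
    # Squared Euclidean distances from the input to every rule centre.
    d2 = np.sum((x - centers) ** 2, axis=1)
    # Lower and upper firing strengths of each IT2 Gaussian receptive field
    # (uncertain width: sigma_lo <= sigma_hi).
    f_lo = np.exp(-d2 / (2.0 * sigma_lo ** 2))
    f_hi = np.exp(-d2 / (2.0 * sigma_hi ** 2))
    # Normalised weighted sums give the lower and upper output bounds.
    y_lo = f_lo @ w / f_lo.sum()
    y_hi = f_hi @ w / f_hi.sum()
    # Simple average in place of a full Karnik-Mendel type reduction.
    return 0.5 * (y_lo + y_hi)

For instance, it2_rbf_output(np.array([0.5]), np.array([[0.0], [1.0]]), 0.3, 0.6, np.array([0.0, 1.0])) returns a value between the two consequent weights, since each bound is a convex combination of w.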

Cited by 70 publications (57 citation statements); cites 49 references. Citing publications span 2016 to 2024.
“…This concept realises an enhancement layer linking the input layer to the output layer, consistent with the original concept of the RVFLN. Note that recently developed RVFLNs in the literature mostly neglect the direct connection because they are designed with a zero-order output node [8], [11], [14], [15], [17], [41], [50], [51], [63], [66], [67]. The direct connection gives the output node a higher degree of freedom, which aims to improve its local mapping aptitude.…”
Section: II (mentioning; confidence: 99%)
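The excerpt above contrasts zero-order output nodes with a direct input-to-output connection. A minimal sketch, assuming a single randomly fixed enhancement layer and a linear readout; W_enh, b_enh, beta_enh and beta_direct are hypothetical names, not the cited papers' notation.

import numpy as np

def rvfln_forward(X, W_enh, b_enh, beta_enh, beta_direct):
    # Random, fixed enhancement layer (W_enh and b_enh are not trained).
    H = np.tanh(X @ W_enh + b_enh)
    # Output = enhancement contribution + direct input-to-output contribution.
    # Dropping the second term (beta_direct = 0) gives the zero-order design
    # the excerpt says most recent RVFLNs use.
    return H @ beta_enh + X @ beta_direct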
“…Thus, the RBF-NN exploits such a granular signature in order to discriminate the role of each fuzzy set and of the input variables while preserving a balance between transparency and interpretability [1]. The parameter identification of the EDDFN follows a Negative Correlation Learning (NCL) [7] and an Adaptive Back Error Propagation (ABEP) approach [11], which we call, for short, Adaptive Negative Correlation Learning (ANCL). The NCL introduces a penalty term into the cost function of each individual DDFM, minimising its Mean Square Error (MSE) together with its correlation with the rest of the ensemble, so that every DDFM is ultimately trained by the ANCL.…”
Section: A. Overview of the Monitoring System Data (mentioning; confidence: 99%)
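A minimal sketch of the negative-correlation penalty mentioned in the excerpt, using the classical NCL form (member MSE plus a lambda-weighted decorrelation term); this reflects the general technique as an assumption, not the cited papers' exact ABEP/ANCL update.

import numpy as np

def ncl_cost(y_true, member_outputs, k, lam):
    # Ensemble output is the average of all member outputs.
    y_ens = np.mean(member_outputs, axis=0)
    # Member error plus the NCL penalty: the (negative) squared deviation of
    # the member from the ensemble, which decorrelates member errors.
    err_k = member_outputs[k] - y_true
    penalty = -(member_outputs[k] - y_ens) ** 2
    # lam = 0 recovers plain MSE training of member k.
    return np.mean(0.5 * err_k ** 2 + lam * penalty)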
“…For the training of the NFM-T1 and the IT2-RBF-NN, 50% and 100% of the total instances were employed. According to [6] and [7], two different phases are involved in estimating the final parameters of the NFM-T1 and the IT2-RBF-NN. First, the initial parameters of the NFM-T1 and IT2-RBF-NN are computed by granulating the training data set (50% and 100%); then a second step, based on an adaptive gradient descent approach, is employed to optimise the fuzzy inference mechanism.…”
Section: Iris Plant Classification (mentioning; confidence: 99%)
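A minimal sketch of the two-phase identification described above, assuming a crude prototype draw in place of the granulation step and gradient descent on the consequent weights only; the cited works' granulation and adaptive learning are considerably richer.

import numpy as np

def two_phase_fit(X, y, n_rules, lr=0.05, epochs=100, seed=0):
    rng = np.random.default_rng(seed)
    # Phase 1: "granulate" the training data: here simply random prototypes
    # as rule centres and a shared spread taken from the data scale.
    centers = X[rng.choice(len(X), n_rules, replace=False)]
    sigma = X.std() + 1e-8
    w = np.zeros(n_rules)
    # Phase 2: adaptive gradient descent on the squared output error.
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            f = np.exp(-np.sum((x_i - centers) ** 2, axis=1) / (2.0 * sigma ** 2))
            phi = f / f.sum()
            err = phi @ w - y_i
            w -= lr * err * phi
    return centers, sigma, w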
“…In other words, the theoretical structure of granulation involves the integration of parts into a whole, where granules are drawn together by specific canonical forms, usually viewed as fuzzy constraints, that lead to a conclusion expressed in natural language [2]. Within this context, several efforts have been devoted to data compression algorithms based on granulation that are able not only to group compatible data but also to capture relationships and rules within the information [3][4][5][6][7][8][9]. For instance, in [2] the authors developed a granulation mechanism that captures data relationships in the form of information granules (hyperboxes), while also emphasising process transparency.…”
Section: Introduction (mentioning; confidence: 99%)
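A minimal sketch of hyperbox-style granulation in the spirit of the mechanism attributed to [2], assuming a single per-dimension size threshold theta; the contraction and overlap handling of the actual algorithms is omitted.

import numpy as np

def granulate(X, theta):
    boxes = []  # each granule is a [lower_corner, upper_corner] hyperbox
    for x in X:
        for box in boxes:
            lo = np.minimum(box[0], x)
            hi = np.maximum(box[1], x)
            # Absorb the point only if the expanded box stays within the
            # per-dimension size limit theta (compatibility constraint).
            if np.all(hi - lo <= theta):
                box[0], box[1] = lo, hi
                break
        else:
            # No compatible granule exists: start a new one around this point.
            boxes.append([x.copy(), x.copy()])
    return boxes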