2016
DOI: 10.1016/j.ins.2016.03.047

Learning of interval and general type-2 fuzzy logic systems using simulated annealing: Theory and practice

Abstract: This paper reports the use of simulated annealing to design more efficient fuzzy logic systems to model problems with associated uncertainties. Simulated annealing is used within this work as a method for learning the best configurations of interval and general type-2 fuzzy logic systems to maximize their modeling ability. The combination of simulated annealing with these models is presented in the modeling of four benchmark problems including real-world problems. The type-2 fuzzy logic system m…
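As a rough illustration of the approach the abstract describes, the following is a minimal, generic simulated-annealing loop. It is not the authors' implementation: the objective, step size, and cooling constants are all illustrative, with the parameter vector standing in for the membership-function parameters of a fuzzy logic system whose modeling error SA would minimize.

```python
import math
import random

def simulated_annealing(cost, x0, step=0.1, t0=1.0, cooling=0.95, iters=2000):
    """Generic SA minimizing `cost` over a list of parameters.

    All tuning constants here are illustrative; in the paper's setting
    `cost` would be the modeling error of a type-2 fuzzy logic system
    on benchmark data.
    """
    random.seed(0)  # deterministic run for the demo
    x, fx = list(x0), cost(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(iters):
        # Perturb one randomly chosen parameter.
        cand = list(x)
        i = random.randrange(len(cand))
        cand[i] += random.uniform(-step, step)
        fc = cost(cand)
        # Accept improvements always; accept worse moves with
        # probability exp(-delta / t) (Metropolis criterion).
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

# Toy objective: squared distance from the point (1, 2, 3).
obj = lambda p: sum((a - b) ** 2 for a, b in zip(p, [1.0, 2.0, 3.0]))
sol, err = simulated_annealing(obj, [0.0, 0.0, 0.0])
```

The Metropolis acceptance step is what distinguishes SA from greedy search: early on, when the temperature is high, worse configurations are accepted often enough to escape local minima of the error surface.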


Cited by 54 publications (31 citation statements)
References 40 publications
“…In the sixth example, the results of proposed methods are used in the prediction of a chaotic time series when it is subjected to noise with two strategies. The results of these two strategies are compared with the LM algorithm in Khanesar, Kayacan, Teshnehlab, and Kaynak (2011, April), EKF in Khanesar et al (2012), and the SA in Almaraashi et al (2016). In order for the researchers to have a fair comparison of the proposed methods with other ones in terms of training and checking the data, the program has been run 10 times separately.…”
Section: Simulation Results
confidence: 99%
“…Optimization using derivative-based algorithms like gradient descent (GD; Zhao, Li, & Irwin), least squares (LS; Huang & Chen), Levenberg-Marquardt (LM; De Canete, Garcia-Cerezo, García-Moral, Del Saz, & Ochoa; Jang & Mizutani), and the Kalman filter (KF) and its variants (Barragán, Al-Hadithi, Jiménez, & Andújar; Khanesar, Kayacan, Teshnehlab, & Kaynak), and the Simplex method (Wang, Li, & Li) is dependent on derivative information, whereas the updating rules of derivative-free algorithms such as the genetic algorithm (GA; Sarkheyli, Zain, & Sharif), particle swarm optimization (PSO; Elloumi, Krid, & Masmoudi; Ghomsheh, Shoorehdeli, & Teshnehlab; Maldonado, Castillo, & Melin), adaptive bee colony (Habbi, Boudouaoui, Karaboga, & Ozturk; Bagis & Konar), ant colony optimization (Juang & Hsu; Juang, Hung, & Hsu), the search group algorithm (Noorbin & Alfi), simulated annealing (SA; Almaraashi et al; Almaraashi, John, Hopgood, & Ahmadi), and the water cycle algorithm (Pahnehkolaei, Alfi, Sadollah, & Kim) are not reliant on the functional derivative.…”
Section: Introduction
confidence: 99%
“…First, the application of the gradient descent algorithm can be difficult for more complex and large-scale problems due to its slow convergence and susceptibility to local minima of the error function. Therefore, further research might explore the adaptation of the IVIFIS using evolutionary algorithms, in a similar manner as for IT2FLSs [23], [24]. Second, several premise parameters were fixed for the sake of simplicity, such as the hesitancy degrees of nonmembership functions and the widths of the IVIFSs.…”
Section: Discussion
confidence: 99%
“…The main rationale for its selection relies on its easy-to-implement probabilistic algorithm, able to yield very good solutions for a wide variety of problems. A thorough study of the fast convergence and relatively low complexity of the algorithm is out of the scope of this paper, but it has been recently reported [31]. The SA algorithm starts with initial modeling parameters HIM = [0, 0, 0] and evaluates the MAPE performance index.…”
Section: Hybrid Incremental Modeling Based on the Optimal Setting
confidence: 99%
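The MAPE index mentioned in the excerpt above is a standard error measure. A minimal sketch of one common definition (the cited paper's exact variant and data are not reproduced here; the sample values are made up):

```python
def mape(actual, predicted):
    """Mean absolute percentage error, in percent.

    Assumes no actual value is zero, since each term divides by it.
    """
    n = len(actual)
    return 100.0 / n * sum(
        abs((a - p) / a) for a, p in zip(actual, predicted)
    )

err = mape([100.0, 200.0, 400.0], [110.0, 190.0, 400.0])
# Errors of 10%, 5%, and 0% on the three points -> mean of 5.0
```

In an SA-driven setup like the one quoted, this value would serve as the cost that the annealing loop evaluates for each candidate parameter vector.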