Real-world systems are dominated by shifts and drifts that conventional neuro-fuzzy systems struggle to cope with. Learning in nonstationary environments therefore calls for a highly flexible system capable of assembling its rule base autonomously according to the degree of nonlinearity in the data. In practice, rule growing and pruning must rely on only a small snapshot of the complete training data, keeping computational load and memory demand low. To this end, a novel algorithm, the parsimonious network based on fuzzy inference system (PANFIS), is presented herein. PANFIS can commence its learning process from scratch with an empty rule base; fuzzy rules are subsequently added or removed by virtue of their statistical contributions and those of each injected datum. Highly similar fuzzy sets can be detected and blended into a single fuzzy set, in pursuit of a transparent rule base that improves human interpretability. The learning and modeling performance of the proposed PANFIS is numerically validated on several benchmark problems drawn from real-world and synthetic datasets. The validation includes comparisons with state-of-the-art evolving neuro-fuzzy methods and shows that the new method is competitive with, and in some cases outperforms, these approaches in terms of predictive fidelity and model complexity.
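The rule-base simplification step described above, merging highly similar fuzzy sets into one, can be illustrated with a minimal sketch. The similarity measure and merging formulas below are illustrative assumptions for one-dimensional Gaussian fuzzy sets, not the exact expressions used in PANFIS.

```python
import math

def similarity(a, b):
    """Gaussian-kernel similarity of two fuzzy sets given as (center, width).
    An illustrative measure, not the paper's exact definition."""
    (c1, s1), (c2, s2) = a, b
    return math.exp(-((c1 - c2) ** 2) / ((s1 + s2) ** 2))

def merge_pass(sets, threshold=0.8):
    """One greedy pass over the fuzzy sets: any set whose similarity to an
    already-kept set exceeds the threshold is fused into it, using an
    averaged center and a width enlarged to cover both original sets."""
    merged = []
    for s in sets:
        for i, m in enumerate(merged):
            if similarity(s, m) >= threshold:
                c = 0.5 * (s[0] + m[0])                       # averaged center
                w = 0.5 * (s[1] + m[1]) + 0.5 * abs(s[0] - m[0])  # widened spread
                merged[i] = (c, w)
                break
        else:
            merged.append(s)
    return merged
```

For example, two nearly coincident sets at centers 0.0 and 0.05 collapse into one, while a distant set at 1.0 survives untouched, shrinking the rule base without losing coverage.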
In this paper, we introduce a new algorithm for incremental learning of a specific form of Takagi-Sugeno fuzzy systems proposed by Wang and Mendel in 1992. The new data-driven online learning approach includes not only the adaptation of linear parameters appearing in the rule consequents, but also the incremental learning of premise parameters appearing in the membership functions (fuzzy sets), together with a rule learning strategy in sample mode. A modified version of vector quantization is exploited for rule evolution and an incremental learning of the rules' premise parts. The modifications include an automatic generation of new clusters based on the nature, distribution, and quality of new data and an alternative strategy for selecting the winning cluster (rule) in each incremental learning step. Antecedent and consequent learning are connected in a stable manner, meaning that a convergence toward the optimal parameter set in the least-squares sense can be achieved. An evaluation and a comparison to conventional batch methods based on static and dynamic process models are presented for high-dimensional data recorded at engine test benches and at rolling mills. For the latter, the obtained data-driven fuzzy models are even compared with an analytical physical model. Furthermore, a comparison with other evolving fuzzy systems approaches is carried out based on nonlinear dynamic system identification tasks and a three-input nonlinear function approximation example.

Index Terms—Convergence to optimality, incremental clustering, robust evolving fuzzy models, static and dynamic process modeling, Takagi-Sugeno fuzzy systems.
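The modified vector quantization described above, where a new cluster (rule) is generated automatically when incoming data cannot be explained by existing clusters, can be sketched as follows. The vigilance test and learning-rate update below are simplified assumptions standing in for the paper's actual cluster-generation and winner-selection criteria.

```python
import numpy as np

def evolving_vq(data, vigilance=0.3, lr=0.1):
    """Evolving vector quantization sketch: for each sample, the nearest
    cluster center is found; if the sample lies farther than `vigilance`
    from every center, a new cluster is created (rule evolution), else the
    winning center is moved toward the sample (premise adaptation)."""
    centers = []
    for x in data:
        if not centers:
            centers.append(np.array(x, dtype=float))
            continue
        dists = [np.linalg.norm(x - c) for c in centers]
        w = int(np.argmin(dists))           # winning cluster
        if dists[w] > vigilance:
            centers.append(np.array(x, dtype=float))  # evolve a new rule
        else:
            centers[w] += lr * (x - centers[w])       # adapt the winner
    return centers
```

In the full method, each resulting cluster seeds a rule premise, while the rule consequents are fitted separately by recursive weighted least squares, which is what yields convergence toward the optimal parameter set in the least-squares sense.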
In this paper, a novel evolving fuzzy-rule-based classifier, termed parsimonious classifier (pClass), is proposed. pClass can start its learning engine from scratch with an empty rule base or from initially trained fuzzy models. It adopts an open structure and a plug-and-play concept in which automatic knowledge building, rule-base simplification, knowledge recall, and soft feature reduction are carried out on the fly, with limited expert knowledge and without prior assumptions about the underlying data distribution. In this paper, three state-of-the-art classifier architectures, namely multi-input-multi-output, multimodel, and round-robin architectures, are also critically analyzed. The efficacy of pClass has been numerically validated by means of real-world and synthetic streaming data featuring various concept drifts, noisy learning environments, and dynamic class attributes. In addition, comparative studies with prominent algorithms using comprehensive statistical tests have confirmed that pClass delivers superior performance in terms of classification rate, number of fuzzy rules, and number of rule-base parameters.
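At prediction time, an evolving fuzzy classifier of this kind reduces to selecting the class of the rule that fires most strongly on the input. The sketch below is a generic winner-takes-all inference over Gaussian rules and is only an illustration of the architecture, not pClass's exact inference scheme.

```python
import numpy as np

def classify(x, rules):
    """Winner-takes-all fuzzy classification: each rule is a tuple
    (center, width, class_label); the label of the rule with the highest
    Gaussian firing strength on input x is returned."""
    strengths = [np.exp(-np.sum((x - c) ** 2 / (2.0 * w ** 2)))
                 for c, w, _ in rules]
    return rules[int(np.argmax(strengths))][2]
```

An evolving classifier would grow, prune, and merge the entries of `rules` online as the stream drifts, which is exactly the knowledge-building and rule-base simplification machinery the abstract describes.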