2007
DOI: 10.1016/j.neucom.2006.06.016
Self-organizing multi-modeling: A different way to design intelligent predictors

Cited by 12 publications (8 citation statements)
References 17 publications

Citation statements, ordered by relevance:
“…The parametric identification is performed to evaluate the parameters of the local models. One can use either a static or a recursive methodology [6].…”
Section: Nonlinear Systems and Multi-model Approaches (mentioning, confidence: 99%)
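As a concrete illustration of the distinction drawn in this statement (the data, model order, and code are invented for the example and are not taken from the cited paper), the parameters of one linear local model can be estimated either in a single batch (static) step or sample by sample with recursive least squares:

```python
# Illustrative only: static (batch) vs. recursive least-squares estimation of one
# local ARX-type model y_k = a*y_{k-1} + b*u_{k-1}; the data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
a_true, b_true = 0.8, 0.5
u = rng.normal(size=200)
y = np.zeros(200)
for k in range(1, 200):
    y[k] = a_true * y[k - 1] + b_true * u[k - 1] + 0.01 * rng.normal()

Phi = np.column_stack([y[:-1], u[:-1]])     # regressors [y_{k-1}, u_{k-1}]
target = y[1:]

# Static (batch) least squares over the whole data set
theta_batch, *_ = np.linalg.lstsq(Phi, target, rcond=None)

# Recursive least squares: the same estimate built up sample by sample
theta = np.zeros(2)
P = 1e3 * np.eye(2)                         # large initial covariance = weak prior
for phi, yk in zip(Phi, target):
    K = P @ phi / (1.0 + phi @ P @ phi)     # gain vector
    theta = theta + K * (yk - phi @ theta)  # correct the estimate
    P = P - np.outer(K, phi @ P)            # update the covariance

print("batch    :", theta_batch)            # close to [0.8, 0.5]
print("recursive:", theta)                  # converges to the same values
```

Both routes converge to the same parameter vector here; the recursive form is the natural choice when the local model must be updated online as new data arrive.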
“…However, it is often found that no single classifier is entirely satisfactory for a particular task, and hence the idea of combining different methods in order to improve overall performance has emerged as potentially very promising. There are some examples of systems that combine classifiers, such as in [6,21,35,38,40,41]. In [40], for instance, the authors proposed an algorithm which estimates the problem complexity, divides an initial hard classification problem into several easier ones (divide to simplify), and builds a self-organized neural tree structure.…”
Section: Ensemble Systems (mentioning, confidence: 99%)
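The combination idea this statement refers to can be sketched in a few lines (a toy construction, not the systems of refs [6,21,35,38,40,41]): several weak classifiers each label the data, and the majority label is taken as the ensemble output.

```python
# Toy illustration of combining classifiers: three weak threshold rules vote and
# the majority label is returned; data and rules are invented for the example.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)              # ground-truth labels

def stump(feature, threshold):
    """A very weak classifier: threshold a single feature."""
    return lambda X: (X[:, feature] > threshold).astype(int)

classifiers = [stump(0, 0.0), stump(1, 0.0), stump(0, 0.2)]

def majority_vote(classifiers, X):
    votes = np.stack([clf(X) for clf in classifiers])  # (n_classifiers, n_samples)
    return (votes.mean(axis=0) >= 0.5).astype(int)

for i, clf in enumerate(classifiers):
    print(f"classifier {i} accuracy:", (clf(X) == y).mean())
print("majority vote accuracy  :", (majority_vote(classifiers, X) == y).mean())
```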
“…These major points could be defeated by adding selforganizing skill to a multi-model (or to a modular architecture) leading to powerful structure, especially if local models (or local modules) are ANN based units. A number of such self-organizing multiple ANN structures have been proposed and described in [9], [10], [12] and [13]. These works could be divided into two main directions.…”
Section: Introductionmentioning
confidence: 99%
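As a rough sketch of what such a self-organizing multi-model can look like (an illustrative construction, not the architecture of the cited works [9]-[13] or of this paper): operating-region centres are placed by a simple k-means pass standing in for a self-organizing mechanism, local affine models are fitted by weighted least squares, and normalized Gaussian validity functions blend them into one predictor.

```python
# Sketch of a multi-model predictor: local affine models blended by normalized
# Gaussian validity functions; centres come from a simple k-means pass standing
# in for a self-organizing mechanism. Everything here is illustrative.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(-3, 3, 400)
y = np.sin(2 * x) + 0.05 * rng.normal(size=x.size)    # nonlinear target

# 1) Place operating-region centres with a few k-means iterations
centres = rng.choice(x, size=6, replace=False)
for _ in range(10):
    labels = np.argmin(np.abs(x[:, None] - centres[None, :]), axis=1)
    centres = np.array([x[labels == j].mean() if np.any(labels == j) else centres[j]
                        for j in range(centres.size)])
sigma = 0.8 * np.mean(np.diff(np.sort(centres)))

# 2) Normalized Gaussian validity (gating) functions
act = np.exp(-0.5 * ((x[:, None] - centres[None, :]) / sigma) ** 2)
w = act / act.sum(axis=1, keepdims=True)

# 3) One local affine model per region, fitted by weighted least squares,
#    then blended by the validity functions into a single prediction
Phi = np.column_stack([x, np.ones_like(x)])
y_hat = np.zeros_like(x)
for j in range(centres.size):
    sw = np.sqrt(w[:, j])
    theta = np.linalg.lstsq(Phi * sw[:, None], y * sw, rcond=None)[0]
    y_hat += w[:, j] * (Phi @ theta)

print("RMS prediction error:", np.sqrt(np.mean((y - y_hat) ** 2)))
```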