1996
DOI: 10.1162/neco.1996.8.4.869

Engineering Multiversion Neural-Net Systems

Abstract: In this paper we address the problem of constructing reliable neural-net implementations, given the assumption that any particular implementation will not be totally correct. The approach taken in this paper is to organize the inevitable errors so as to minimize their impact in the context of a multiversion system, i.e., the system functionality is reproduced in multiple versions, which together will constitute the neural-net system. The unique characteristics of neural computing are exploited in order to engi…

Cited by 189 publications (91 citation statements)
References 10 publications
“…From this point of view, heuristic rules such as "choose the best" or "choose the best in the class", using classifiers of different types, strongly reduce the computational complexity of the selection phase, as the evaluation of different classifier subsets is not required [103]. Moreover, test-and-select methods implicitly include a "production stage", by which a set of classifiers must be generated.…”
Section: Mixtures of Experts Methods
confidence: 99%
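
The two heuristics quoted in this statement are simple enough to sketch directly. Below is a minimal illustration in Python with scikit-learn: "choose the best" keeps the single classifier with the highest validation accuracy, and "choose the best in the class" keeps the most accurate member of each classifier type, avoiding any search over subsets. The pool composition, dataset, and split are illustrative assumptions, not taken from the cited work.

```python
# Minimal sketch of the "choose the best" and "choose the best in the
# class" selection heuristics. Pool and data are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.33, random_state=0)

# A pool mixing several classifier types, each in a few configurations.
pool = [DecisionTreeClassifier(max_depth=d).fit(X_tr, y_tr) for d in (2, 4, 8)]
pool += [KNeighborsClassifier(n_neighbors=k).fit(X_tr, y_tr) for k in (1, 5, 15)]
pool += [GaussianNB().fit(X_tr, y_tr)]

# "Choose the best": the single highest validation accuracy, no subset search.
best = max(pool, key=lambda clf: clf.score(X_val, y_val))

# "Choose the best in the class": the best member of each classifier type.
best_per_class = {}
for clf in pool:
    name = type(clf).__name__
    if name not in best_per_class or \
       clf.score(X_val, y_val) > best_per_class[name].score(X_val, y_val):
        best_per_class[name] = clf

print("best overall:", type(best).__name__)
print("best per class:", sorted(best_per_class))
```
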
“…In general, the selection of a subset of classifiers is done using the OCS strategy [13,30], in which a large set of classifiers is produced and the best-performing subset is then selected from it. GAs are a popular technique within this strategy.…”
Section: Related Work on Genetic Selection of MCSs
confidence: 99%
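
A minimal sketch of the overproduce-and-choose strategy (OCS) described in this statement, with a small genetic algorithm searching over binary inclusion masks: a large pool of classifiers is produced first, then a subset is chosen by the validation accuracy of its majority vote. Pool size, GA parameters, and data are illustrative assumptions, not the cited papers' settings.

```python
# Sketch of OCS with a tiny GA over classifier subsets (assumed settings).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.33, random_state=0)

# Overproduce: a large pool of diverse classifiers via bootstrap replicates.
pool = []
for _ in range(30):
    idx = rng.integers(0, len(X_tr), len(X_tr))
    pool.append(DecisionTreeClassifier(max_depth=3).fit(X_tr[idx], y_tr[idx]))
preds = np.array([clf.predict(X_val) for clf in pool])  # (n_clf, n_val)

def fitness(mask):
    """Validation accuracy of the majority vote of the selected subset."""
    if mask.sum() == 0:
        return 0.0
    votes = preds[mask.astype(bool)].mean(axis=0) >= 0.5
    return (votes == y_val).mean()

# Choose: evolve binary inclusion masks with a minimal GA.
pop = rng.integers(0, 2, size=(20, len(pool)))
for _ in range(40):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-10:]]            # truncation selection
    cut = rng.integers(1, len(pool), size=10)
    children = np.array([np.concatenate([parents[i % 10][:c],
                                         parents[(i + 1) % 10][c:]])
                         for i, c in enumerate(cut)])  # one-point crossover
    flip = rng.random(children.shape) < 0.05           # bit-flip mutation
    children = np.where(flip, 1 - children, children)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected", int(best.sum()), "of", len(pool),
      "classifiers; val acc =", fitness(best))
```

Truncation selection plus one-point crossover keeps the sketch short; real OCS implementations typically add fitness terms penalizing ensemble size or rewarding diversity.
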
“…As a consequence, fuzzy rule-based multiclassification systems (FRBMCSs) were incorporated into an overproduce-and-choose strategy (OCS) [13]. This MCS design algorithm is based on the generation of a large number of component classifiers and the subsequent selection of the best-cooperating subset of them.…”
Section: Introduction
confidence: 99%
“…BAG: The original Bagging algorithm proposed in [7]. We train decision-tree ensembles of sizes 5, 10, 15, 20, 25, and 30 and choose the one with the best val-B accuracy.…”
Section: Comparison with AdaBoost and Bagging
confidence: 99%
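
The selection step quoted in this statement reduces to training a bagged ensemble at each candidate size and keeping the one that scores best on a held-out split. A minimal sketch with scikit-learn's BaggingClassifier follows; the dataset and the validation split standing in for "val-B" are illustrative assumptions.

```python
# Sketch of choosing a bagging ensemble size by held-out accuracy
# (dataset and split are assumptions standing in for "val-B").
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.33, random_state=0)

best_acc, best_model = -1.0, None
for n in (5, 10, 15, 20, 25, 30):
    model = BaggingClassifier(DecisionTreeClassifier(), n_estimators=n,
                              random_state=0).fit(X_tr, y_tr)
    acc = model.score(X_val, y_val)       # accuracy on the held-out split
    if acc > best_acc:
        best_acc, best_model = acc, model

print("best ensemble size:", best_model.n_estimators, "val acc:", best_acc)
```
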
“…That is, we are interested both in pruning the inaccurate classifiers and, to keep a check on complexity, in pruning the redundant ones. "Diversity" measures have been proposed [23,22]; one possibility is an incremental, forward search where we add a classifier to the ensemble if it is diverse or adds to accuracy [9,11,35,49,42], and another possibility is a decremental, backward search where a classifier is removed or pruned if it is not diverse enough or if its removal does not increase error [30,27]. In this work, we propose an alternative method that combines base classifiers using principal component analysis (PCA) to obtain uncorrelated eigenclassifiers.…”
Section: Introduction
confidence: 99%
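
A minimal sketch of the incremental, forward search mentioned in this statement: starting from the single best classifier, greedily add a pool member whenever it improves the majority-vote validation accuracy. Plain accuracy gain stands in here for the diversity criteria of the cited works; the pool construction and data are illustrative assumptions.

```python
# Sketch of greedy forward ensemble selection (accuracy-gain criterion
# as a stand-in for diversity; pool and data are assumptions).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.33, random_state=0)

# Pool of classifiers trained on bootstrap replicates.
pool = []
for _ in range(30):
    idx = rng.integers(0, len(X_tr), len(X_tr))
    pool.append(DecisionTreeClassifier(max_depth=3).fit(X_tr[idx], y_tr[idx]))
preds = np.array([clf.predict(X_val) for clf in pool])  # (n_clf, n_val)

def vote_acc(members):
    """Majority-vote accuracy of the listed pool members on validation."""
    votes = preds[members].mean(axis=0) >= 0.5
    return (votes == y_val).mean()

# Seed with the single best classifier, then grow greedily.
members = [int(np.argmax([(p == y_val).mean() for p in preds]))]
improved = True
while improved:
    improved = False
    for i in range(len(pool)):
        if i in members:
            continue
        if vote_acc(members + [i]) > vote_acc(members):
            members.append(i)
            improved = True

print("kept", len(members), "of", len(pool), "classifiers; val acc =",
      vote_acc(members))
```

The backward, decremental variant the statement also mentions would run the same loop in reverse, removing a member whenever the ensemble's validation error does not increase.
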