2010
DOI: 10.1007/s13042-010-0006-8
An improved multiple fuzzy NNC system based on mutual information and fuzzy integral

Abstract: The multiple nearest neighbor classifier system (MNNCS) is a popular way to relax the curse of dimensionality. In previous work, most MNNCSs have been designed by random methods, which may generate unstable component classifiers; to relax this randomness, a large number of component classifiers is needed. This paper first extends the nearest neighbor classifier into a fuzzy nearest neighbor classifier, and then proposes a new multiple fuzzy nearest neighbor classifier system based on mutual information and fuzzy integral. …
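To ground the term, here is a minimal Keller-style fuzzy k-NN sketch in Python; the inverse-distance weighting, the fuzzifier m, and the use of crisp training labels are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def fuzzy_knn_memberships(X_train, y_train, x, k=5, m=2.0, n_classes=None):
    """Return fuzzy class-membership degrees for a query point x.

    A sketch of the fuzzy nearest neighbor idea: instead of a crisp vote,
    each of the k nearest neighbors contributes an inverse-distance weight
    to its class, and the weights are normalized into membership degrees.
    """
    if n_classes is None:
        n_classes = int(y_train.max()) + 1
    d = np.linalg.norm(X_train - x, axis=1)        # distances to all samples
    idx = np.argsort(d)[:k]                        # k nearest neighbors
    # Inverse-distance weights 1 / d^(2/(m-1)); m > 1 is the fuzzifier.
    w = 1.0 / np.maximum(d[idx], 1e-12) ** (2.0 / (m - 1.0))
    u = np.zeros(n_classes)
    for j, wj in zip(idx, w):
        u[y_train[j]] += wj                        # accumulate per class
    return u / u.sum()                             # memberships sum to 1
```

In the ensemble the abstract describes, each component classifier of this kind would typically see only a subset of the features; the paper's contribution is to select those subsets with mutual information rather than at random and to fuse the resulting membership vectors with a fuzzy integral.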

Cited by 56 publications (21 citation statements, published 2011–2023). References 25 publications.
“…Using s, we propose the GenA measure at (14), where the binomial coefficient term $\binom{n}{K}^{-1}$ accounts for the number of possible K-tuples in h. Equation (14) can be written in a similar spirit as (9)…”
Section: B. Measure of Generalized Accord (mentioning)
confidence: 99%
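Read literally, the quote describes normalizing a sum of similarities over every size-K subset of the n inputs by the binomial coefficient C(n, K). A minimal sketch under that reading, with the K-ary similarity s left as a caller-supplied assumption:

```python
from itertools import combinations
from math import comb

def gen_a(h, s, K):
    """Average a K-ary similarity s over all K-tuples drawn from h.

    Dividing by C(n, K) matches the binomial-coefficient normalizer in the
    quote; s itself (e.g., an interval or fuzzy-number similarity) is an
    assumption supplied by the caller.
    """
    n = len(h)
    return sum(s(tup) for tup in combinations(h, K)) / comb(n, K)
```

With K = 2 and a pairwise overlap similarity, this reduces to the average pairwise agreement of the n inputs.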
“…The GenA FM for FN-valued inputs can be directly calculated by substituting an FN similarity function, such as those at (30), into the formulation at (14), where the input to the GenA measure is now the set of FNs $h_i$. Furthermore, the same recursive method proposed for the interval-valued GenA measure can be used for FN-valued inputs.…”
Section: B. Measure of Generalized Accord (mentioning)
confidence: 99%
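The similarity functions "at (30)" are not reproduced in the quote; one common choice for fuzzy numbers (FNs) is a Jaccard-style overlap of membership functions. A hypothetical sketch for triangular FNs, offered purely as an assumption:

```python
import numpy as np

def tri_membership(z, a, b, c):
    """Triangular FN membership with support [a, c] and peak at b (a < b < c)."""
    return np.clip(np.minimum((z - a) / (b - a), (c - z) / (c - b)), 0.0, 1.0)

def fn_similarity(p, q, grid=np.linspace(-10.0, 10.0, 2001)):
    """Jaccard-style similarity of triangular FNs p = (a, b, c), q = (a, b, c).

    Ratio of the areas under min and max of the two membership functions,
    sampled on a fixed grid that must cover both supports. One plausible
    stand-in for an FN similarity, not the cited paper's equation (30).
    """
    mp, mq = tri_membership(grid, *p), tri_membership(grid, *q)
    return np.minimum(mp, mq).sum() / np.maximum(mp, mq).sum()
```

Such a function plugs into the gen_a sketch above, e.g. s = lambda pair: fn_similarity(*pair) for K = 2.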
“…A large number of subsequent studies have successively introduced extensions of Hartley entropy and Shannon entropy [16], relative entropy [17], cumulative residual entropy [18][19][20][21], joint entropy [22,23], conditional entropy [24][25][26], mutual information [27][28][29][30][31][32], cross entropy [33][34][35][36][37][38], fuzzy entropy [15,39], the maximum entropy principle [40,41] and the minimum cross-entropy principle [42,43], and a series of achievements have been made in these areas. Zhong makes use of general information functions to unify the methods of describing information metrics with entropy formulas [4].…”
Section: About the Metrics of Information (mentioning)
confidence: 99%
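Since mutual information is the metric this paper takes from that family, a plain plug-in estimate for discrete variables may be useful for orientation; this is a generic textbook estimator, not the formulation of any cited work.

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(X; Y) in bits from paired discrete samples."""
    _, x_idx = np.unique(x, return_inverse=True)
    _, y_idx = np.unique(y, return_inverse=True)
    # Joint frequency table of (X, Y), then normalize to probabilities.
    joint = np.zeros((x_idx.max() + 1, y_idx.max() + 1))
    np.add.at(joint, (x_idx, y_idx), 1.0)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)            # marginal of X
    py = pxy.sum(axis=0, keepdims=True)            # marginal of Y
    nz = pxy > 0                                   # skip log(0) terms
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))
```

For feature selection of the kind discussed here, each candidate feature would be scored by its mutual information with the class labels, keeping the highest-scoring ones.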
“…Tong and Mitram [39] use genetic algorithms to choose among neural network activation functions and features to increase classifier performance. Wang [43] uses mutual information for feature selection when combining nearest neighbor classifiers with the fuzzy integral. Ensemble algorithms have been proposed and applied to different problems in different domains [48,47,26,52].…”
Section: Introduction (mentioning)
confidence: 99%
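The fuzzy-integral combination credited to Wang [43] commonly takes the form of a Choquet integral of per-classifier confidences with respect to a fuzzy measure. A generic sketch, with the measure mu caller-supplied rather than derived from mutual information as in [43]:

```python
import numpy as np

def choquet_integral(scores, mu):
    """Choquet integral of classifier confidences w.r.t. a fuzzy measure mu.

    scores: one confidence per component classifier.
    mu:     maps a frozenset of classifier indices to [0, 1]; assumed
            monotone with mu(frozenset()) = 0 and mu(all indices) = 1.
    """
    order = np.argsort(scores)                     # ascending by score
    h = np.asarray(scores, dtype=float)[order]
    result, prev = 0.0, 0.0
    for i in range(len(h)):
        A = frozenset(order[i:].tolist())          # classifiers scoring >= h[i]
        result += (h[i] - prev) * mu(A)            # layer-cake summation
        prev = h[i]
    return result
```

With an additive mu (a sum of per-classifier weights) this collapses to a weighted average; a non-additive measure is what lets the integral reward or discount particular coalitions of classifiers.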