2010
DOI: 10.1007/s13042-010-0008-6

An efficient gene selection technique for cancer recognition based on neighborhood mutual information

Abstract: Gene selection is a key problem in gene expression based cancer recognition and related tasks. A measure, called neighborhood mutual information (NMI), is introduced in this work to evaluate the relevance between genes and the related decision. The measure is then combined with the search strategy of minimal redundancy and maximal relevance (mRMR) to construct an NMI-based mRMR gene selection algorithm (NMI_mRMR). In addition, it is also found that the first k best genes with respect to NMI are usually enough f…
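The abstract describes the NMI_mRMR procedure only at a high level. As a rough, hedged sketch of how such a selector could be put together (not the authors' implementation; the function names, the per-feature delta-neighborhoods, the default radius delta = 0.2, and the assumption that expression values are scaled to [0, 1] and class labels are integer coded are all choices made for this illustration), a greedy NMI-based mRMR loop might look like this:

```python
import numpy as np

def nbhd(feature, delta):
    """Boolean n x n matrix: entry (i, j) is True if sample j lies in the
    delta-neighborhood of sample i with respect to this single feature."""
    f = np.asarray(feature, dtype=float).reshape(-1, 1)
    return np.abs(f - f.T) <= delta

def nbhd_entropy(mask):
    """Neighborhood entropy NH = -(1/n) * sum_i log(|neighborhood(x_i)| / n)."""
    n = mask.shape[0]
    return -np.mean(np.log(mask.sum(axis=1) / n))

def nmi(f, g, delta=0.2):
    """Neighborhood mutual information NMI(f; g) = NH(f) + NH(g) - NH(f, g),
    with the joint neighborhood taken as the intersection of the two."""
    A, B = nbhd(f, delta), nbhd(g, delta)
    return nbhd_entropy(A) + nbhd_entropy(B) - nbhd_entropy(A & B)

def nmi_mrmr(X, y, k, delta=0.2):
    """Greedy mRMR-style selection with NMI: at each step pick the gene that
    maximizes relevance to the class minus mean redundancy with the genes
    already selected."""
    n_genes = X.shape[1]
    relevance = np.array([nmi(X[:, j], y, delta) for j in range(n_genes)])
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(n_genes):
            if j in selected:
                continue
            redundancy = np.mean([nmi(X[:, j], X[:, s], delta) for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected
```

Scoring each candidate by its relevance to the class minus its mean NMI with the already selected genes is the usual mRMR trade-off; using NMI in place of classical mutual information is what lets continuous expression values be handled without discretization.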

Cited by 103 publications (43 citation statements). References 36 publications. Citing publications span 2011 to 2022.
“…Mutating particles with a certain probability when diversity decreases restrains premature convergence and allows the particles to jump out of local optima in time to find a better solution; conversely, it gives the particles the ability to explore new areas from the early stages of iteration through to the end. The calculation formula of particle information entropy [32] is as follows:…”
Section: Information Entropy (mentioning)
confidence: 99%
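The particle information entropy formula referred to in this excerpt is truncated above and is not reproduced here. Purely as a hedged illustration of the general idea (an assumption, not necessarily the form used in the citing paper or in its reference [32]), swarm diversity is often quantified with a Shannon entropy over a partition of the particle population,

\[ E = -\sum_{j=1}^{m} p_j \log p_j, \]

where p_j is the fraction of particles falling into cell (or cluster) j of the partition; when E drops below a threshold, diversity is considered low and the mutation step described above is triggered.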
“…Neighborhood mutual information is defined in the literature [8,9]. The neighborhood mutual information of R and S is defined as…”
Section: Feature Selection Algorithm Based On Neighborhood Mutual Information (mentioning)
confidence: 99%
“…Neighborhood mutual information (NMI) is an effective measure that avoids the disadvantages of MI. It is constructed by integrating the concept of a neighborhood into Shannon's information theory, and it is a natural generalization of MI in numerical feature spaces [8,9].…”
Section: Introduction (mentioning)
confidence: 99%
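Neither excerpt reproduces the definition it refers to. For context, a sketch of the neighborhood mutual information definition as it is usually stated in the neighborhood rough set literature (the notation here may differ in detail from [8,9]): for feature subsets R and S over samples x_1, ..., x_n with delta-neighborhoods δ_R(x_i),

\[
NH_\delta(R) = -\frac{1}{n}\sum_{i=1}^{n}\log\frac{\|\delta_R(x_i)\|}{n},
\qquad
NH_\delta(R,S) = -\frac{1}{n}\sum_{i=1}^{n}\log\frac{\|\delta_R(x_i)\cap\delta_S(x_i)\|}{n},
\]
\[
NMI_\delta(R;S) = NH_\delta(R) + NH_\delta(S) - NH_\delta(R,S),
\]

which mirrors the classical identity I(X;Y) = H(X) + H(Y) - H(X,Y) with neighborhood granules in place of probability cells.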
“…A large number of subsequent studies have successively introduced the concepts of the expansion of Hartley entropy and Shannon entropy [16], relative entropy [17], cumulative residual entropy [18][19][20][21], joint entropy [22,23], conditional entropy [24][25][26], mutual information [27][28][29][30][31][32], cross entropy [33][34][35][36][37][38], fuzzy entropy [15,39], the maximum entropy principle [40,41] and the minimum cross-entropy principle [42,43], and a series of achievements have been made in these areas. Zhong makes use of general information functions to unify the methods of describing information metrics with entropy formulas [4].…”
Section: About The Metrics Of Information (mentioning)
confidence: 99%
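The excerpt enumerates entropy-based metrics without giving their relationships. As a small, self-contained toy example (not taken from any of the cited works), the script below computes Shannon entropy, joint entropy, conditional entropy, and mutual information for a two-by-two joint distribution, using the identities H(Y|X) = H(X,Y) - H(X) and I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import numpy as np

# Toy joint distribution p(x, y) over two binary variables (rows: x, columns: y).
pxy = np.array([[0.30, 0.10],
                [0.15, 0.45]])

def H(p):
    """Shannon entropy in bits of a probability array (zero entries contribute 0)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

px, py = pxy.sum(axis=1), pxy.sum(axis=0)   # marginal distributions
Hx, Hy, Hxy = H(px), H(py), H(pxy)

cond = Hxy - Hx          # conditional entropy H(Y|X)
mi = Hx + Hy - Hxy       # mutual information I(X;Y)

print(f"H(X)={Hx:.3f}  H(Y)={Hy:.3f}  H(X,Y)={Hxy:.3f}  H(Y|X)={cond:.3f}  I(X;Y)={mi:.3f}")
```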