2023
DOI: 10.1016/j.resourpol.2022.103265
Determination of sublevel stoping layout using a network flow algorithm and the MRMR classification system



Cited by 4 publications (2 citation statements)
References 13 publications
“…To assess the impact on each class, some feature selection methods in HSI datasets are compared in the experiment: maximum information minimum redundancy (MRMR) [47], joint mutual information with class correlation (JOMIC) [48], joint mutual information maximization (JMIM) [49], conditional mutual information maximization (CMIM) [50] and shallow-to-deep feature enhancement (SDFE) [51]. According to Table 4, the proposed method exhibits significantly higher reductive efficiency than other EA-based feature selection methods. Specifically, it selects less than 20% of the features from the HSI dataset, resulting in the selection of only 42 features out of a total of 176 bands in the KSC dataset while achieving a prominent OA.…”
Section: Comparison With Other Feature Selection Methods (mentioning)
confidence: 99%
“…To assess the impact on each class, some feature selection methods in HSI datasets are compared in the experiment: maximum information minimum redundancy (MRMR) [47], joint mutual information with class correlation (JOMIC) [48], joint mutual information maximization (JMIM) [49], conditional mutual information maximization (CMIM) [50] and shallow-to-deep feature enhancement (SDFE) [51]. The experiments are performed on 10% to 25% of the total features.…”
Section: Comparison With Other Feature Selection Methods (mentioning)
confidence: 99%
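The citation statements above compare several mutual-information-based feature selectors (MRMR, JOMIC, JMIM, CMIM) on hyperspectral band-selection tasks. As a rough illustration of the greedy relevance-minus-redundancy idea behind MRMR, the following is a minimal sketch only; the scoring via scikit-learn's mutual information estimators and the band-selection loop are illustrative assumptions, not the cited papers' implementations.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression

def mrmr_select(X, y, k, random_state=0):
    """Greedy MRMR-style band selection (illustrative sketch).

    At each step, pick the band whose relevance to the labels minus its
    mean redundancy with already-selected bands is largest.
    """
    n_features = X.shape[1]
    # Relevance: mutual information between each band and the class labels.
    relevance = mutual_info_classif(X, y, random_state=random_state)
    selected = [int(np.argmax(relevance))]  # start with the most relevant band
    candidates = set(range(n_features)) - set(selected)

    while len(selected) < k and candidates:
        best_j, best_score = None, -np.inf
        for j in candidates:
            # Redundancy: average MI between candidate band j and selected bands.
            redundancy = np.mean([
                mutual_info_regression(X[:, [j]], X[:, s],
                                       random_state=random_state)[0]
                for s in selected
            ])
            score = relevance[j] - redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
        candidates.remove(best_j)
    return selected

# Hypothetical usage on an HSI-like matrix (pixels x spectral bands):
# X has shape (n_pixels, 176) and y holds per-pixel class labels;
# chosen = mrmr_select(X, y, k=42)  # e.g. keep 42 of 176 bands, as in the KSC example above
```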