2019
DOI: 10.1016/j.patcog.2019.06.003
Manifold regularized discriminative feature selection for multi-label learning

Cited by 246 publications (54 citation statements)
References 25 publications
“…Seo et al. [30] proposed an improved k-cardinality entropy approximation-based criterion for multi-label feature selection and investigated the parameter k. Sun et al. [33] presented a mutual information-based method that obtains the optimal feature subset via constrained convex optimization. Zhang et al. [42] presented an algorithm called MDFS, which exploits label correlations via manifold regularization and then selects the optimal features with l2,1-norm regularization within a convex formulation. Li et al. [17] put forward a multi-label feature selection method based on granular computing, which first granulates the label space and then selects the optimal feature subset with a multi-label maximal-correlation, minimal-redundancy criterion.…”
Section: Multi-label Feature Selection Methods
confidence: 99%
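As a hedged sketch of the selection step described in this statement: ℓ2,1-regularized methods such as MDFS learn a feature weight matrix whose uninformative rows are driven toward zero, and features are then ranked by the ℓ2 norm of their rows. The function below is an illustrative reconstruction under that assumption, not the authors' code; the matrix `W` is a hypothetical toy input.

```python
import numpy as np

def rank_features_by_row_norm(W):
    """Rank features by the l2 norm of each row of the weight matrix W.

    W has shape (n_features, n_labels); l_{2,1} regularization drives
    whole rows toward zero, so large row norms mark informative features.
    """
    row_norms = np.linalg.norm(W, ord=2, axis=1)
    return np.argsort(row_norms)[::-1]  # indices, most informative first

# toy example: 4 features, 3 labels (hypothetical values)
W = np.array([[0.9, 0.8, 0.7],   # strong feature
              [0.0, 0.1, 0.0],   # nearly zeroed-out row
              [0.5, 0.4, 0.6],
              [0.0, 0.0, 0.0]])  # fully suppressed feature
print(rank_features_by_row_norm(W))  # → [0 2 1 3]
```

Selecting the top-k indices of this ranking yields the reduced feature subset.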
“…In order to verify the validity and feasibility of FNPRMS, experiments are designed to compare it with six other multi-label feature selection algorithms: MLNB [45], MDDMspc [44], MDDMproj [44], PMU [16], MLFRS [23], and MDFS [42]. The parameter smooth in MLNB is set to 1, as recommended in [45].…”
Section: Experiment Settings
confidence: 99%
“…The method of manifold regularized discriminative feature selection [35] maps the original feature information into a low-dimensional space to capture the local label correlations, and then constructs manifold-constrained label information to explore the global correlations.…”
Section: Related Work
confidence: 99%
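The manifold constraint mentioned in this summary is commonly realized as a graph-Laplacian smoothness term tr(Fᵀ L F). A minimal sketch under that assumption follows; the similarity matrix `S` and embedding `F` are toy inputs, not data or code from the paper.

```python
import numpy as np

def manifold_regularizer(F, S):
    """Graph-Laplacian smoothness term tr(F^T L F) with L = D - S.

    S is a symmetric similarity matrix over samples; F holds per-sample
    embeddings as rows. The value is small when similar samples (large
    S_ij) receive similar rows of F."""
    L = np.diag(S.sum(axis=1)) - S  # graph Laplacian
    return float(np.trace(F.T @ L @ F))

# identity check: tr(F^T L F) == 0.5 * sum_ij S_ij * ||F_i - F_j||^2
S = np.array([[0.0, 1.0, 0.2],
              [1.0, 0.0, 0.5],
              [0.2, 0.5, 0.0]])
F = np.array([[1.0, 0.0],
              [0.9, 0.1],
              [0.0, 1.0]])
pairwise = 0.5 * sum(S[i, j] * np.sum((F[i] - F[j]) ** 2)
                     for i in range(3) for j in range(3))
print(abs(manifold_regularizer(F, S) - pairwise) < 1e-9)  # → True
```

Adding this term to a feature-selection objective penalizes embeddings that break the neighborhood structure of the data, which is how local correlations are preserved.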
“…All the predicted results are compared with three traditional algorithms, namely KNN, DT, and SVM. The evaluation metrics used in the experiments are Hamming loss (HL), Coverage, Ranking loss (RL), and Average precision (AVP), which provide an objective evaluation of multi-label algorithms [21,22]. For AVP, a higher value indicates a better model; for the other metrics, smaller values are better.…”
Section: Experimental Design and Evaluation
confidence: 99%
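Two of the metrics named in this statement, Hamming loss and Ranking loss, can be sketched directly. The definitions below follow common conventions (score ties counted as misrankings); the toy label matrices and scores are illustrative, not data from the cited experiments.

```python
import numpy as np

def hamming_loss(Y_true, Y_pred):
    """Fraction of label slots predicted incorrectly (lower is better)."""
    return float(np.mean(np.asarray(Y_true) != np.asarray(Y_pred)))

def ranking_loss(Y_true, scores):
    """Average fraction of (relevant, irrelevant) label pairs that the
    score vector orders incorrectly (lower is better); ties count as
    misrankings under this convention."""
    losses = []
    for y, s in zip(np.asarray(Y_true), np.asarray(scores)):
        pos, neg = np.where(y == 1)[0], np.where(y == 0)[0]
        if len(pos) == 0 or len(neg) == 0:
            continue  # undefined for all-positive / all-negative rows
        bad = sum(1 for p in pos for n in neg if s[p] <= s[n])
        losses.append(bad / (len(pos) * len(neg)))
    return float(np.mean(losses))

# toy data: 2 instances, 3 labels
Y_true = [[1, 0, 1], [0, 1, 0]]
Y_pred = [[1, 1, 1], [0, 1, 0]]
scores = [[0.9, 0.2, 0.8], [0.1, 0.7, 0.3]]
print(hamming_loss(Y_true, Y_pred))  # 1 of 6 slots wrong → ~0.167
print(ranking_loss(Y_true, scores))  # every relevant label outscores every irrelevant one → 0.0
```

Coverage and Average precision are rank-based in the same spirit; library implementations (e.g. scikit-learn's `hamming_loss` and `label_ranking_loss`) can substitute for these sketches.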