2015
DOI: 10.1109/tnnls.2015.2396937
MRM-Lasso: A Sparse Multiview Feature Selection Method via Low-Rank Analysis

Abstract: Learning about multiview data involves many applications, such as video understanding, image classification, and social media. However, when the data dimension increases dramatically, it is important but very challenging to remove redundant features in multiview feature selection. In this paper, we propose a novel feature selection algorithm, multiview rank minimization-based Lasso (MRM-Lasso), which jointly utilizes Lasso for sparse feature selection and rank minimization for learning relevant patterns across…

Cited by 60 publications (19 citation statements)
References 35 publications (48 reference statements)
“…The least absolute shrinkage and selection operator (LASSO) method can reduce the dimensionality of high-dimensional data and obtain a better-fitting model. The LASSO Cox regression model was used to identify the ideal coefficients for each prognostic signature [37,38]. Multivariate Cox regression analysis was employed in the most important survival-related AS events that were selected from each AS type to establish a prognostic signature (PS).…”
Section: Prognostic Signatures For Alternative Splicing Events In Ger…
Mentioning confidence: 99%
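The feature-selection effect the quoted statement describes comes from the LASSO's ℓ1 penalty driving most coefficients exactly to zero, so only the surviving features enter the model. A minimal sketch on synthetic data, using scikit-learn's plain `Lasso` as a stand-in for the cited LASSO Cox regression (the data, `alpha`, and all names here are illustrative, not from the cited studies):

```python
# Illustrative sketch: l1-penalized regression drives most coefficients
# to exactly zero, so the surviving features are "selected".
# Plain Lasso is used as a stand-in for the LASSO Cox model.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_samples, n_features = 100, 500              # p >> n regime
X = rng.standard_normal((n_samples, n_features))
true_coef = np.zeros(n_features)
true_coef[:5] = [2.0, -1.5, 1.0, 0.8, -0.5]   # only 5 informative features
y = X @ true_coef + 0.1 * rng.standard_normal(n_samples)

lasso = Lasso(alpha=0.1)
lasso.fit(X, y)
selected = np.flatnonzero(lasso.coef_)        # indices of nonzero weights
print(len(selected))
```

In the survival setting the quadratic loss is replaced by the Cox partial likelihood, but the ℓ1-induced sparsity works the same way.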
“…For convenience, we use $\mathbf{s}_i \in \mathbb{R}^{N}$ to denote the $i$-th column of the connectivity matrix $S$, which characterizes the connections of region $O_i$ with respect to the other brain regions. We also arrange all Pearson's correlation values into an $N \times N$ matrix $C = \{c(\mathbf{x}_i, \mathbf{x}_j) \mid i, j = 1, \cdots, N\}$. Instead of calculating the connectivity $s_{ij}$ just based on the Pearson correlation $c(\mathbf{x}_i, \mathbf{x}_j)$ between the observed BOLD signals $\mathbf{x}_i$ and $\mathbf{x}_j$, we optimize the connectivity matrix $S$ by integrating the above three criteria:

$$\arg\min_S \; \|S - C\|_F^2 + \alpha \|S\|_* + \gamma \|S\|_1 \quad (1)$$

where $\alpha$ and $\gamma$ are scalars that balance the strength of the low-rank constraint [17] on $S$ (the second term) and the $\ell_1$ sparsity constraint [18] on $S$ (the third term).…”
Section: Robust Dynamic Functional Connectivity
Mentioning confidence: 99%
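The objective in Eq. (1) combines a quadratic fit to the Pearson matrix with a nuclear-norm (low-rank) and an ℓ1 (sparsity) penalty. One standard way to solve such composite problems is ADMM, alternating singular value thresholding for the nuclear norm with elementwise soft thresholding for the ℓ1 term. The sketch below is a generic illustration under assumed weights and synthetic data, not the cited paper's algorithm (the quote does not even fix whether the fidelity term is squared):

```python
# Generic ADMM sketch (NOT the cited paper's method) for
#   argmin_S  1/2 * ||S - C||_F^2 + alpha*||S||_* + gamma*||S||_1
# alpha, gamma, rho, and the synthetic data are assumed for illustration.
import numpy as np

def soft(X, t):
    """Elementwise soft thresholding: prox of t*||.||_1."""
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def svt(X, t):
    """Singular value thresholding: prox of t*||.||_* (nuclear norm)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

def robust_connectivity(C, alpha=0.5, gamma=0.05, rho=1.0, n_iter=200):
    S = C.copy()
    Z = C.copy()
    U = np.zeros_like(C)           # scaled dual variable
    for _ in range(n_iter):
        # S-update: prox of 1/2*||S-C||_F^2 + alpha*||S||_*
        # (the two quadratics combine, giving a closed-form SVT step)
        S = svt((C + rho * (Z - U)) / (1.0 + rho), alpha / (1.0 + rho))
        # Z-update: prox of the l1 penalty
        Z = soft(S + U, gamma / rho)
        # dual ascent on the constraint S = Z
        U += S - Z
    return Z

rng = np.random.default_rng(0)
B = rng.standard_normal((30, 200))     # 30 regions, 200 time points
C = np.corrcoef(B)                     # stand-in for the Pearson matrix
S = robust_connectivity(C)
print(S.shape)
```

The soft-thresholding step produces exact zeros in the returned connectivity matrix, which is what makes the ℓ1 term a sparsity constraint rather than mere shrinkage.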
“…Some examples of multi-view learning algorithms include: multi-view support vector machines [18], multi-view Boosting [19], multi-view k-means [20], and clustering via canonical correlation analysis [21]. However, barring a few exceptions (e.g., multi-view feature selection methods [22], and multi-view representation learning [23]) the vast majority of existing multi-view learning algorithms are not equipped to effectively cope with the high-dimensionality of omics data [17]. Hence, predictive modeling from multi-omics data calls for effective methods for multi-view feature selection or dimensionality reduction.…”
Section: Introduction
Mentioning confidence: 99%