2021
DOI: 10.1137/20m1352405
An Alternating Rank-k Nonnegative Least Squares Framework (ARkNLS) for Nonnegative Matrix Factorization

Cited by 11 publications (3 citation statements)
References 40 publications
“…2e, g, 3b, and 4h) at the level of individual neurons and trials (Fig. 3c, 4e), we used non-negative TCA [68] with a framework of alternating rank-k non-negativity constrained least squares based on block principal pivoting method [69]. Our analysis was focused on rank 4 as a higher number of latents overfitted the data.…”
Section: Saturating Activation Ratios
Citation type: mentioning; confidence: 99%
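The cited approach factorizes a nonnegative data array by alternating nonnegativity-constrained least squares subproblems. Below is a minimal sketch of that alternating scheme for a plain two-factor NMF, using scipy.optimize.nnls as a generic stand-in for the block principal pivoting solver; the function name nmf_anls, the matrix names V, W, H, and the toy data are illustrative, not the cited implementation.

```python
import numpy as np
from scipy.optimize import nnls  # generic NNLS solver; the cited work uses block principal pivoting


def nmf_anls(V, rank, n_iter=100, seed=0):
    """Sketch of NMF via alternating nonnegative least squares: V ~= W @ H, with W, H >= 0."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_iter):
        # Fix W, solve min_{H >= 0} ||V - W H||_F column by column.
        H = np.column_stack([nnls(W, V[:, j])[0] for j in range(n)])
        # Fix H, solve min_{W >= 0} ||V - W H||_F row by row (via the transposed problem).
        W = np.column_stack([nnls(H.T, V[i, :])[0] for i in range(m)]).T
    return W, H


# Toy usage: factorize a small nonnegative matrix at rank 4 (the rank used in the cited analysis).
V = np.abs(np.random.default_rng(1).random((30, 20)))
W, H = nmf_anls(V, rank=4, n_iter=50)
print(np.linalg.norm(V - W @ H) / np.linalg.norm(V))
```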
“…In this paper, we combine these approaches to tackle the HPO related to Matrix Decomposition (MDs) in an unsupervised scenario. In particular, we focused our attention on Nonnegative Matrix Factorizations (NMF) and its sparseness constrained variants [8,6,12,19,23,25,34,35,39]. We regard these problems as penalized optimization tasks with penalization coefficients to be considered HPs, and we focus on their proper choice on HPO issue.…”
Section: Introduction
Citation type: mentioning; confidence: 99%
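The penalization coefficient mentioned in this excerpt plays the role of the hyperparameter being tuned. A minimal sketch of what such a penalized NMF looks like, assuming an L1-penalized Frobenius objective solved with standard multiplicative updates; the penalty weight lam, the function sparse_nmf, and the crude grid over lam are illustrative assumptions, not the cited papers' exact formulation or HPO procedure.

```python
import numpy as np


def sparse_nmf(V, rank, lam=0.1, n_iter=200, eps=1e-9, seed=0):
    """L1-penalized NMF: min_{W,H>=0} ||V - W H||_F^2 + lam * sum(H), via multiplicative updates."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + lam + eps)  # L1 penalty enters the denominator of the H update
        W *= (V @ H.T) / (W @ H @ H.T + eps)        # plain Frobenius update for W
    return W, H


# The penalty weight lam is the hyperparameter; a small grid illustrates the HPO loop.
V = np.abs(np.random.default_rng(1).random((40, 25)))
for lam in (0.0, 0.1, 1.0):
    W, H = sparse_nmf(V, rank=5, lam=lam)
    err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
    print(f"lam={lam}: relative error={err:.3f}, mean of H={H.mean():.3f}")
```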
“…Large scale applications in which (NNLS) appears include NMR relaxometry [62], imaging deblurring [4], biometric templates [40], transcriptomics [42], magnetic microscopy [50], sparse hyperspectral unmixing [26], and system identification [14]. The formulation in (NNLS) is also closely related to non-negative matrix factorization [19] and supervised learning methods such as support vector machines [67]. We refer the interested reader to [13] for a survey about the development of algorithms that enforce non-negativity.…”
Section: Introduction
Citation type: mentioning; confidence: 99%
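For reference, the (NNLS) problem this excerpt discusses is min_{x >= 0} ||Ax - b||_2, and small instances can be solved directly with an off-the-shelf active-set solver. A minimal sketch on synthetic data follows; the matrix sizes and noise level are arbitrary.

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic instance of min_{x >= 0} ||A x - b||_2.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 10))
x_true = np.maximum(rng.standard_normal(10), 0.0)   # nonnegative ground truth
b = A @ x_true + 0.01 * rng.standard_normal(100)

x_hat, residual = nnls(A, b)                        # Lawson-Hanson style active-set NNLS
print("all coefficients nonnegative:", bool(np.all(x_hat >= 0)))
print("residual norm:", residual)
```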