2023
DOI: 10.1016/j.patcog.2022.109102

A consistent and flexible framework for deep matrix factorizations

Cited by 10 publications (4 citation statements)
References 30 publications
“…Inspired by the work of [20,25], we combine the deep L_p smooth symmetric matrix factorization model, as described in Section 2, with (3) to formulate a comprehensive loss function that incorporates weighted sums of the layer-wise contributions.…”
Section: Algorithm for DSSNMF (mentioning)
Confidence: 99%
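Read on its own, the quoted construction amounts to a weighted sum of per-layer reconstruction terms plus an L_p smoothness penalty. As a purely illustrative sketch (neither equation (3) nor the exact DSSNMF model appears in this excerpt, so the symbols S, U_l, λ_l, μ, and φ_p below are assumptions rather than the authors' notation), such an objective can be written as

\min_{U_1,\dots,U_L \ge 0} \; \sum_{l=1}^{L} \lambda_l \left\| S - (U_1 \cdots U_l)(U_1 \cdots U_l)^{\mathsf{T}} \right\|_F^2 \;+\; \mu \sum_{l=1}^{L} \phi_p(U_l),

where S is the symmetric input matrix, U_1, ..., U_L are the layer factors, the weights λ_l set the layer-wise contributions, and φ_p stands for an L_p smoothness penalty on each factor.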
“…To minimize (4), the block coordinate descent (BCD) [15,20,25] is employed, necessitating the blockwise updating of the factor matrices. In this study, we utilize the fast projected gradient method (FPGM) [26] to update the factor matrices U_l or V_l.…”
Section: Algorithm for DSSNMF (mentioning)
Confidence: 99%
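The quoted scheme, block coordinate descent over the factor matrices with a projected gradient solver for each block, can be illustrated with a minimal sketch. The sketch below simplifies in several labeled ways: it uses a plain deep nonnegative factorization X ≈ U_1 ⋯ U_L rather than the symmetric L_p-smooth model, omits the smoothness term, and applies an unaccelerated projected gradient step in place of the FPGM of [26]; all function and parameter names (chain, pg_update, bcd_deep_factorization, ranks, n_outer, n_inner) are hypothetical, not taken from the cited work.

import numpy as np

def chain(mats, size):
    # Product of a list of matrices; identity of the given size when the list is empty.
    out = np.eye(size)
    for M in mats:
        out = out @ M
    return out

def pg_update(X, factors, l, n_inner=20):
    # Blockwise projected gradient descent on f(U_l) = 0.5 * ||X - U_1 ... U_L||_F^2
    # with the other factors held fixed. The step 1/L uses the gradient's Lipschitz
    # constant for this block; FPGM's extrapolation (momentum) step is omitted here.
    left = chain(factors[:l], X.shape[0])
    right = chain(factors[l + 1:], factors[l].shape[1])
    L = max(np.linalg.norm(left.T @ left, 2) * np.linalg.norm(right @ right.T, 2), 1e-12)
    U = factors[l]
    for _ in range(n_inner):
        grad = left.T @ (left @ U @ right - X) @ right.T
        U = np.maximum(U - grad / L, 0.0)  # projection onto the nonnegative orthant
    return U

def bcd_deep_factorization(X, ranks, n_outer=50, n_inner=20, seed=0):
    # Block coordinate descent: cycle over the factor matrices and refine each one
    # in turn with the projected gradient routine above.
    rng = np.random.default_rng(seed)
    dims = [X.shape[0]] + list(ranks) + [X.shape[1]]
    factors = [rng.random((dims[i], dims[i + 1])) for i in range(len(dims) - 1)]
    for _ in range(n_outer):
        for l in range(len(factors)):
            factors[l] = pg_update(X, factors, l, n_inner)
    return factors

# Example: factor a random nonnegative 100 x 80 matrix through two hidden layers.
X = np.random.default_rng(1).random((100, 80))
factors = bcd_deep_factorization(X, ranks=[20, 10])

The per-block Lipschitz step 1/L is what makes the inner loop a standard projected gradient method for this quadratic subproblem; FPGM would add an extrapolation point to accelerate the same iteration.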