2019
DOI: 10.1109/lsp.2019.2897230
Bayesian Non-Negative Matrix Factorization With Adaptive Sparsity and Smoothness Prior

Cited by 12 publications (3 citation statements) · References 25 publications
“…Two data loss scenarios are considered in this paper: random missing (RM), in which entries are missing randomly, and non‐random missing (NM), in which each PMU time series has block missing for randomly selected channels (e.g., 40% of the PMU channels have block missing lasting 40% of the time window T). The algorithms for comparison include the proposed RFMR algorithms, CSI [25], MC-based ADMM [31], low‐rank tensor completion (LRTC) [34], traditional truncated Hankel‐SVD [38], and Bayesian probabilistic matrix factorization (BPMF) [46]. Table 5 shows the results of the imputation task, and the following fourfold conclusions can be derived: 1) When the PMU data are accurately observed, the completion performance of the proposed RFMR is not as good as that of the existing methods in some application scenarios. 2) The interpolation methods represented by CSI perform better recovery in random missing tasks, while BPMF has lower TVEs in non‐random missing recovery.…”
Section: Experiments and Analysis
confidence: 99%
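The two data-loss scenarios contrasted above (RM vs. NM) can be made concrete with a small mask-generation sketch. This is only an illustration of the two missingness patterns, not the cited paper's experimental setup; the channel count, window length, and missing rates below are arbitrary example values.

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels, T = 20, 100
X = rng.standard_normal((n_channels, T))  # stand-in for a PMU data matrix

# Random missing (RM): every entry is dropped independently with probability p.
p = 0.4
rm_mask = rng.random((n_channels, T)) < p  # True marks a missing entry

# Non-random missing (NM): 40% of the channels each lose one contiguous
# block covering 40% of the time window T (as in the example in the quote).
nm_mask = np.zeros((n_channels, T), dtype=bool)
block_len = int(0.4 * T)
lost = rng.choice(n_channels, size=int(0.4 * n_channels), replace=False)
for ch in lost:
    start = rng.integers(0, T - block_len + 1)
    nm_mask[ch, start:start + block_len] = True

X_rm = np.where(rm_mask, np.nan, X)  # observed matrix under RM
X_nm = np.where(nm_mask, np.nan, X)  # observed matrix under NM
```

Interpolation methods such as CSI tend to do well under RM because every gap has nearby observed neighbors, whereas the long contiguous gaps of NM leave no local support, which is where low-rank factorization methods such as BPMF can help.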
“…Two data loss scenarios are considered in this paper: random missing (RM), in which entries are missing randomly, and non-random missing (NM), in which each PMU time series has block missing for randomly selected channels (e.g., 40% of the PMU channels have block missing lasting 40% of the time window T). The algorithms for comparison include the proposed RFMR algorithms, CSI [25], MC-based ADMM [31], low-rank tensor completion (LRTC) [34], traditional truncated Hankel-SVD [38], and Bayesian probabilistic matrix factorization (BPMF) [46]. Table 5 shows the results of the imputation task and the following fourfold conclusions can be derived:…”
Section: Robust Recovery For Data Loss Under Low-SNRs
confidence: 99%
“…In Algorithm 1, line 5 is solved classically since efficient algorithms exist for it [38], and would probably be applied with some kind of regularization since NMF and its variants tend to generally be ill-posed problems [17,39]. It is line 8 that is solved using BLLS (where QAOA would be applied).…”
Section: Non-negative Matrix Factorization (NMF) and Variants
confidence: 99%
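The "solved classically" subproblem mentioned in the last quote is the standard NMF least-squares step: with one factor fixed, each column (or row) of the other factor is a nonnegative least-squares (NNLS) problem. A minimal alternating-NNLS sketch, assuming `scipy.optimize.nnls`; this is a generic textbook scheme, not the algorithm of the cited paper, and it omits the regularization the quote says such problems usually need:

```python
import numpy as np
from scipy.optimize import nnls

def nmf_anls(X, r, n_iter=50, seed=0):
    """Alternating nonnegative least squares for X ~= W @ H, W, H >= 0."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(n_iter):
        # Fix W: each column of H is an independent NNLS problem.
        for j in range(n):
            H[:, j], _ = nnls(W, X[:, j])
        # Fix H: each row of W is an NNLS problem via the transpose trick.
        for i in range(m):
            W[i, :], _ = nnls(H.T, X[i, :])
    return W, H

# Exactly low-rank nonnegative test matrix: ANLS should fit it closely.
rng = np.random.default_rng(1)
X = rng.random((8, 3)) @ rng.random((3, 10))
W, H = nmf_anls(X, r=3)
err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
```

Without regularization the factorization is not unique (any nonnegative rescaling of W's columns can be absorbed into H), which is one reason NMF variants are described as ill-posed in the quote.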