ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp39728.2021.9415104
Rank-Revealing Block-Term Decomposition for Tensor Completion

Abstract: The so-called block-term decomposition (BTD) tensor model has been recently receiving increasing attention due to its enhanced ability of representing systems and signals that are composed of blocks of rank higher than one, a scenario encountered in numerous and diverse applications. In this paper, BTD is employed for the completion of a tensor from its partially observed entries. A novel method is proposed, which is based on the idea of imposing column sparsity jointly on the BTD factors and in a hierarchical…
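The abstract's central idea — imposing column sparsity jointly on the BTD factors so that superfluous columns shrink toward zero and the ranks are thereby revealed — can be illustrated with a minimal sketch. All function names here are hypothetical, and the group-lasso-style penalty below is only an illustrative stand-in for the paper's hierarchical regularizer:

```python
import numpy as np

def column_sparsity_penalty(factors, eps=1e-12):
    """Sum of column l2-norms across the factor matrices (a group-lasso
    style surrogate). Minimizing a fit term plus this penalty drives
    unneeded columns jointly toward zero, revealing the ranks.
    Illustrative only; not the paper's exact hierarchical regularizer."""
    penalty = 0.0
    for A in factors:
        penalty += np.sum(np.sqrt(np.sum(A**2, axis=0) + eps))
    return penalty

def prune_columns(factors, tol=1e-6):
    """Drop columns whose norm is negligible jointly across all factors;
    the number of surviving columns is the revealed rank."""
    norms = np.ones(factors[0].shape[1])
    for A in factors:
        norms *= np.linalg.norm(A, axis=0)
    keep = norms > tol
    return [A[:, keep] for A in factors]
```

In a completion setting, such a penalty would be added to a data-fit term evaluated only on the observed entries, with pruning applied between iterations.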


Cited by 9 publications (5 citation statements)
References 20 publications
“…The effectiveness in both selecting and tracking the correct BTD model was clearly demonstrated via simulation results. Future research can be directed towards extending the novel online BTD algorithm to incorporate constraints and side information [69] and perform completion [66] and DL [11] tasks. Modifications necessary for solving large-scale problems (using, for example, sampling/sketching) [54], [80] or being robust (to outliers) [12], [43] are also worth exploring.…”
Section: Discussion
confidence: 99%
“…The authors show that their method's performance is rather robust to overestimation of these ranks. However, this may not always be the case, depending on the application (see [66] and the references therein); moreover, one may want a sufficiently accurate estimate of the ranks for the purpose of interpreting the data (as in, e.g., HSI, where the rank signifies the number of endmembers and the block ranks are the ranks of the corresponding abundance maps). Furthermore, in practice the ranks may vary with time.…”
Section: Related Work
confidence: 99%
“…In the most realistic case of R ≤ N_x N_y, R can be estimated as the rank of (a chunk of) Y, for example with the aid of the SVD [44, Section 5.4], possibly facilitated (via compression) or even replaced by a Gram-Schmidt orthogonalization of Yᵀ, as in, e.g., [44, Section 5.1] or [54]. Alternatively, a rank-revealing version of the previous algorithms is possible, in the spirit of [55] (and of [56] for the case of missing data). Notice that the regularizer in (13) is a tight upper bound of the nuclear norm of Y and promotes smoothness (and hence, implicitly, low-rankness) of the H, S factors.…”
Section: The O-ILSP(-SVP) Algorithm
confidence: 99%
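The rank-estimation step described in the citation above — taking R as the rank of (a chunk of) the data matrix Y, computed via the SVD — can be sketched as follows. The relative-threshold rule is a common numerical-rank heuristic, not necessarily the exact procedure of the cited works:

```python
import numpy as np

def estimate_rank_svd(Y, rel_tol=1e-2):
    """Estimate the numerical rank of Y as the number of singular values
    exceeding rel_tol times the largest one. A chunk (subset of rows or
    columns) of Y can be passed instead of the full matrix to reduce cost."""
    s = np.linalg.svd(Y, compute_uv=False)
    if s.size == 0 or s[0] == 0.0:
        return 0
    return int(np.sum(s > rel_tol * s[0]))

# Example: a 50x40 matrix of exact rank 3, perturbed by small noise
rng = np.random.default_rng(0)
Y = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 40))
Y += 1e-6 * rng.standard_normal((50, 40))
```

The Gram-Schmidt alternative mentioned in the quote would replace the SVD with an orthogonalization pass that stops once the residual of a new column falls below a tolerance.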
“…In parallel, it is well known that the block-term decomposition (BTD) can be viewed as a combination of the CP and Tucker decompositions [35]. In the tensor literature, there are two adaptive BTD algorithms able to factorize streaming tensors, namely OnlineBTD [36] and O-BTD-RLS [37]. They are, however, sensitive to data corruption.…”
Section: Introduction
confidence: 99%
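The remark that BTD combines CP and Tucker can be made concrete: a BTD writes a 3-way tensor as a sum of R small Tucker terms, and when every core is 1×1×1 the model degenerates to CP. A minimal NumPy construction (illustrative; function name is hypothetical):

```python
import numpy as np

def btd_tensor(blocks):
    """Assemble a 3-way tensor from BTD blocks. Each block is a tuple
    (G, A, B, C): a small core G of shape (L, M, N) and factor matrices
    A (I x L), B (J x M), C (K x N). Each term is a Tucker product;
    their sum is the BTD. With 1x1x1 cores this reduces to CP."""
    T = None
    for G, A, B, C in blocks:
        # Contract the core with the three factor matrices (Tucker product)
        term = np.einsum('lmn,il,jm,kn->ijk', G, A, B, C)
        T = term if T is None else T + term
    return T
```

For instance, a single block with a 1×1×1 core of value 1 yields exactly the rank-one outer product of the three factor columns, recovering a CP term.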