2017 25th European Signal Processing Conference (EUSIPCO)
DOI: 10.23919/eusipco.2017.8081290

Nonlinear least squares updating of the canonical polyadic decomposition

Abstract: Current batch tensor methods often struggle to keep up with fast-arriving data. Even storing the full tensors that have to be decomposed can be problematic. To alleviate these limitations, tensor updating methods modify a tensor decomposition using efficient updates instead of recomputing the entire decomposition when new data becomes available. In this paper, the structure of the decomposition is exploited to achieve fast updates for the canonical polyadic decomposition whenever new slices are added …
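Since the abstract only sketches the idea, a minimal illustration may help. The snippet below shows the cheapest possible CPD slice update in NumPy: given factors A, B, C of a rank-R decomposition of an I×J×K tensor, a newly arrived frontal slice contributes one new row to C, obtained from a linear least-squares fit with A and B held fixed. The helper names (khatri_rao, append_slice) are hypothetical, and this is not the paper's nonlinear least squares algorithm, which also refines the existing factors; it is only a sketch of the simplest updating step.

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Khatri-Rao product: column r equals kron(U[:, r], V[:, r])."""
    m, R = U.shape
    n, _ = V.shape
    return (U[:, None, :] * V[None, :, :]).reshape(m * n, R)

def append_slice(A, B, C, X_new):
    """Fold a new frontal slice into a CPD with A and B held fixed.

    A frontal slice of a rank-R CPD satisfies X_k = A @ np.diag(C[k]) @ B.T,
    i.e. vec(X_k) = khatri_rao(B, A) @ C[k] in column-major vectorization,
    so the new row of C is a linear least-squares solution.
    """
    rhs = X_new.ravel(order="F")               # vec(X_new), column-major
    c, *_ = np.linalg.lstsq(khatri_rao(B, A), rhs, rcond=None)
    return np.vstack([C, c])                   # C gains one row per new slice
```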

Cited by 23 publications (27 citation statements)
References 17 publications
“…However, the same approach can also be applied to datasets with fewer or more electrodes without adjustments. Finally, the tensor decomposition can be updated in an efficient way whenever a new batch of EEG data is available [35]. This allows real-time tracking of neonatal sleep states.…”
Section: Discussion
Confidence: 99%
“…The ML approach, as mentioned, is widely used for existing tensor methods and can be solved using alternating optimization [2,62], all-at-once optimization [63], or non-linear least squares [64,65]. Constraints such as non-negativity can be imposed using either active set procedures [66] or the alternating direction method of multipliers [67,68].…”
Section: Maximum Likelihood
Confidence: 99%
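For context on the alternating-optimization route this quote mentions, here is a bare-bones ALS loop for a third-order CPD, reusing the khatri_rao helper from the sketch above. The unfolding convention and the name cpd_als are my own choices, and the loop omits everything a practical solver needs (normalization, convergence checks, regularization); it is a sketch under those assumptions, not the NLS method of [64,65].

```python
import numpy as np

def unfold(X, mode):
    """Mode-n unfolding; the remaining modes are flattened in column-major order."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1, order="F")

def cpd_als(X, R, n_iter=50, seed=0):
    """Rank-R CPD of a 3-way array by alternating least squares:
    each factor is re-solved in turn with the other two held fixed."""
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((n, R)) for n in X.shape)
    for _ in range(n_iter):
        # Mode-0 unfolding satisfies unfold(X, 0) == A @ khatri_rao(C, B).T, etc.
        A = np.linalg.lstsq(khatri_rao(C, B), unfold(X, 0).T, rcond=None)[0].T
        B = np.linalg.lstsq(khatri_rao(C, A), unfold(X, 1).T, rcond=None)[0].T
        C = np.linalg.lstsq(khatri_rao(B, A), unfold(X, 2).T, rcond=None)[0].T
    return A, B, C
```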
“…RBS can achieve high accuracy through step restriction strategies. Updating algorithms can be used to track tensor decompositions that change over time [98,99,100], but they can also be used to decompose large-scale tensors as follows. Rather than loading the whole tensor into memory at once, a smaller subtensor is decomposed first.…”
Section: Sampling: Incompleteness, Randomization and Updating
Confidence: 99%
“…This process is then repeated until all slices have been added. Hence, at any given iteration, only the factorization constructed using the previous slices and one new slice are in memory; see [100].…”
Section: Sampling: Incompleteness, Randomization and Updating
Confidence: 99%
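The slice-by-slice scheme described in these two quotes can be written down compactly by stringing together the helpers from the earlier sketches: a batch ALS on a small leading subtensor, followed by one cheap least-squares append per incoming slice, so only the current factors and one slice are ever in memory. The name stream_cpd and the iterator interface are hypothetical, and, unlike the updating algorithms cited in [98,99,100], the old factors A and B are never refined here.

```python
import numpy as np

def stream_cpd(slices, n_init, R):
    """Decompose a large third-order tensor one frontal slice at a time.

    `slices` is any iterator yielding I-by-J arrays. A batch CPD is computed
    on the first n_init slices; every later slice is folded in with a single
    least-squares solve, so the full tensor is never held in memory.
    """
    slices = iter(slices)
    X0 = np.stack([next(slices) for _ in range(n_init)], axis=2)
    A, B, C = cpd_als(X0, R)        # start-up: batch decomposition of a subtensor
    for X_new in slices:            # updating phase: one slice at a time
        C = append_slice(A, B, C, X_new)
    return A, B, C
```

In an actual updating method such as the paper's, each step would also apply inexpensive refinement sweeps to A, B, and C before the next slice arrives, rather than keeping the old factors frozen.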