2017
DOI: 10.48550/arxiv.1711.10105
Preprint

Tensor Completion Algorithms in Big Data Analytics

Cited by 4 publications (5 citation statements) | References 138 publications
“…In contrast to most works (see our introduction and the overviews [Cichocki et al., 2017; Song et al., 2018]), the tensors in this paper only appear after a discretization of the feature space determined by the labelled data set. Discretizations are of course always needed when a discrete tensor approximates an infinite-dimensional function.…”
Section: Related Work
confidence: 99%
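The excerpt above notes that the tensors only arise after a continuous feature space has been discretized, so that a discrete tensor approximates an infinite-dimensional function. Below is a minimal sketch of that idea, assuming a toy two-variable function, a uniform grid, and a NumPy-only implementation; none of these details come from the cited work.

```python
# Minimal sketch (illustrative assumptions): discretize a smooth function of
# two continuous features onto a uniform grid; the resulting array is the
# discrete tensor (order 2 here) that stands in for the function.
import numpy as np

def f(x, y):
    # Hypothetical target function on [0, 1]^2 (not exactly low rank).
    return 1.0 / (1.0 + x + y)

n = 64                                    # grid points per feature axis
grid = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(grid, grid, indexing="ij")
T = f(X, Y)                               # the discretized tensor

# A low-rank surrogate of T then replaces the original function; refining the
# grid shrinks the discretization error, while the rank controls model size.
U, s, Vt = np.linalg.svd(T, full_matrices=False)
T_rank2 = (U[:, :2] * s[:2]) @ Vt[:2, :]
print("relative error of rank-2 surrogate:",
      np.linalg.norm(T - T_rank2) / np.linalg.norm(T))
```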
“…There are other models, such as RESCAL [3] and Tucker-based methods [2], that have been considered for tensor completion tasks. We refer the interested reader to a recent survey for more information [13].…”
Section: Related Work
confidence: 99%
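Since this excerpt mentions Tucker-based models, a minimal HOSVD-style sketch of a Tucker decomposition is given below; the tensor shape, the multilinear ranks, and the NumPy-only mode operations are illustrative assumptions and are not taken from [2], [3], or the survey [13].

```python
# Minimal HOSVD sketch (illustrative shapes and ranks): approximate a 3-way
# tensor T by a Tucker model  T ~= G x_1 U1 x_2 U2 x_3 U3  with core G.
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_dot(T, M, mode):
    """Multiply tensor T by matrix M along the given mode."""
    out = M @ unfold(T, mode)
    new_dims = [M.shape[0]] + [d for i, d in enumerate(T.shape) if i != mode]
    return np.moveaxis(out.reshape(new_dims), 0, mode)

rng = np.random.default_rng(0)
dims, ranks = (10, 12, 14), (3, 4, 5)

# Synthetic tensor with (approximately) low multilinear rank, plus mild noise.
G0 = rng.standard_normal(ranks)
U0 = [rng.standard_normal((d, r)) for d, r in zip(dims, ranks)]
T = G0
for k, U in enumerate(U0):
    T = mode_dot(T, U, k)
T += 0.01 * rng.standard_normal(T.shape)

# Factor matrices: leading left singular vectors of each unfolding (HOSVD).
factors = [np.linalg.svd(unfold(T, k), full_matrices=False)[0][:, :r]
           for k, r in enumerate(ranks)]

# Core tensor: project T onto the factor subspaces.
G = T
for k, U in enumerate(factors):
    G = mode_dot(G, U.T, k)

# Reconstruct and report the multilinear rank-(3,4,5) approximation error.
T_hat = G
for k, U in enumerate(factors):
    T_hat = mode_dot(T_hat, U, k)
print("relative error:", np.linalg.norm(T - T_hat) / np.linalg.norm(T))
```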
“…Most tensor datasets encountered in the above settings are not fully observed. This leads to tensor completion, the problem of predicting the missing entries given a small number of observations from the tensor [2,13]. In order to recover the missing entries, it is important to take into account the data efficiency of the tensor completion model.…”
Section: Introduction
confidence: 99%
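To make the completion problem described in this excerpt concrete, here is a minimal sketch that fits a rank-R CP (CANDECOMP/PARAFAC) model to the observed entries of a synthetic 3-way tensor by gradient descent on the masked squared error; the rank, step size, observation fraction, and NumPy implementation are illustrative assumptions, not the specific methods of [2] or [13].

```python
# Minimal sketch (illustrative assumptions): complete a partially observed
# 3-way tensor by fitting a CP model, T[i,j,k] ~= sum_r A[i,r] B[j,r] C[k,r],
# to the observed entries only, via plain gradient descent.
import numpy as np

rng = np.random.default_rng(0)
I, J, K, R = 20, 20, 20, 3

# Synthetic low-rank ground truth and a random 20% observation mask.
A0, B0, C0 = (rng.standard_normal((d, R)) for d in (I, J, K))
T_true = np.einsum("ir,jr,kr->ijk", A0, B0, C0)
mask = rng.random((I, J, K)) < 0.2

# Gradient descent on the squared error over the observed entries only.
A, B, C = (0.5 * rng.standard_normal((d, R)) for d in (I, J, K))
lr = 0.005
for _ in range(5000):
    E = mask * (np.einsum("ir,jr,kr->ijk", A, B, C) - T_true)
    gA = np.einsum("ijk,jr,kr->ir", E, B, C)
    gB = np.einsum("ijk,ir,kr->jr", E, A, C)
    gC = np.einsum("ijk,ir,jr->kr", E, A, B)
    A -= lr * gA
    B -= lr * gB
    C -= lr * gC

# Evaluate how well the unobserved entries are recovered.
T_hat = np.einsum("ir,jr,kr->ijk", A, B, C)
err = (np.linalg.norm((~mask) * (T_hat - T_true))
       / np.linalg.norm((~mask) * T_true))
print("relative error on missing entries:", err)
```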
“…Tensor decomposition, as a multi-dimensional generalization of matrix decomposition, is a decades-old mathematical technique in multi-way data analysis, dating back to the 1960s; see [13] and references therein. Tensor decompositions are widely applied in areas ranging from signal processing, such as blind source separation and multi-modal data fusion, to machine learning, such as model compression and learning latent variable models [3,64]. Tensor computing has recently emerged as a promising solution for big data processing: it can model a wide variety of data, such as graphical, tabular, discrete, and continuous data [42,66,78]; it offers algorithms that cater for varying data quality/veracity or missing data [65] and provide real-time analytics for big data velocity, such as streaming analytics [68,69]; and it can capture the complex correlation structure in high-volume data and generate valuable insights for many distributed big data applications [12,14]. Tensor network computing, on the other hand, is a well-established technique in the numerical community; it enables unprecedented large-scale scientific computing with performance comparable to competing techniques such as sparse-grid methods [37,38].…”
Section: Introduction
confidence: 99%
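The tensor network computing mentioned at the end of this excerpt can be illustrated with a tensor-train (TT) decomposition, a simple tensor network built here by sequential truncated SVDs (the TT-SVD procedure); the 4-way test tensor, the rank cap, and the NumPy-only code are illustrative assumptions rather than details from the cited works.

```python
# Minimal TT-SVD sketch (illustrative assumptions): factor a 4-way tensor into
# a chain of 3-way cores G_k of shape (r_{k-1}, n_k, r_k), a basic tensor network.
import numpy as np

def tt_svd(T, max_rank):
    """Build TT cores by sequential truncated SVDs along the modes of T."""
    dims = T.shape
    cores, r_prev, M = [], 1, np.asarray(T)
    for n in dims[:-1]:
        U, s, Vt = np.linalg.svd(M.reshape(r_prev * n, -1), full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(U[:, :r].reshape(r_prev, n, r))
        M = s[:r, None] * Vt[:r, :]        # carry the remainder to the next mode
        r_prev = r
    cores.append(M.reshape(r_prev, dims[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into the full tensor."""
    out = cores[0]                          # shape (1, n_0, r_1)
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=(out.ndim - 1, 0))
    return out.reshape([c.shape[1] for c in cores])

# A tensor with exact TT rank 2: T[i,j,k,l] = x_i + x_j + x_k + x_l.
x = np.linspace(0.0, 1.0, 8)
T = np.add.outer(np.add.outer(np.add.outer(x, x), x), x)

cores = tt_svd(T, max_rank=2)
T_hat = tt_reconstruct(cores)
print("TT ranks:", [c.shape[2] for c in cores[:-1]])
print("relative error:", np.linalg.norm(T - T_hat) / np.linalg.norm(T))
```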