2016 IEEE International Conference on Big Data (Big Data) 2016
DOI: 10.1109/bigdata.2016.7841074
Sequential randomized matrix factorization for Gaussian processes

Cited by 5 publications (9 citation statements). References 4 publications.
“…For higher levels, parallelization must rely on a parallel implementation of matrix multiplication. Another factor that contributes to possible parallelization is the simplicity of the method. Performance on streaming data: in many applications (e.g., Gaussian processes and other correlation analyses), the entries of the matrix X are typically not available all at once, but arrive one column at a time. In such a case, our scheme allows one to process data as they come, effectively scanning the binary tree bottom-to-top along the branches and releasing intermediate factorization results each time the number of columns reaches 2^i q, i = 1, ….…”
Section: Results
confidence: 99%
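The streaming scheme described in the quote above (process columns as they arrive, merge partial factorizations up a binary tree, and release an intermediate result whenever the column count reaches 2^i q) can be illustrated with a minimal sketch. This is not the cited paper's actual algorithm: the leaf factorization via truncated SVD, the rank parameter r, and the block-diagonal merge step are all illustrative assumptions; the tree traversal follows the binary-counter pattern the quote describes.

```python
import numpy as np

def leaf_factor(block, r):
    # Hypothetical leaf step: rank-r factorization of a q-column block
    # via truncated SVD, returning (U, B) with block ~= U @ B.
    U, s, Vt = np.linalg.svd(block, full_matrices=False)
    return U[:, :r], s[:r, None] * Vt[:r]

def merge(f1, f2, r):
    # Merge two rank-r factors covering adjacent column ranges and
    # re-truncate the combined factorization back to rank r.
    U1, B1 = f1
    U2, B2 = f2
    U = np.hstack([U1, U2])
    B = np.block([[B1, np.zeros((B1.shape[0], B2.shape[1]))],
                  [np.zeros((B2.shape[0], B1.shape[1])), B2]])
    return leaf_factor(U @ B, r)

def stream_columns(columns, q, r):
    """Consume columns one at a time; every q columns form a leaf, and
    equal-level factors merge bottom-to-top like a binary counter, so a
    full merge chain fires whenever the column count hits 2**i * q."""
    buf, stack, results = [], [], []   # stack holds (level, factor) pairs
    for col in columns:
        buf.append(col)
        if len(buf) == q:
            level, f = 0, leaf_factor(np.column_stack(buf), r)
            buf = []
            while stack and stack[-1][0] == level:  # merge equal levels
                _, g = stack.pop()
                f = merge(g, f, r)
                level += 1
            stack.append((level, f))
            results.append(f)   # intermediate factorization released here
    return results
```

For a matrix whose true rank is at most r, each intermediate factor reconstructs the columns seen so far exactly; for higher-rank data the re-truncation in `merge` makes each release a rank-r approximation of that prefix.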
“…Lemma 1. Let the parameters r_j be defined by (8), (9), and q ≥ r. Then the total computational cost of the algorithm is as follows.…”
Section: Asymptotic Computational Cost of the Algorithm
confidence: 99%