2020
DOI: 10.1177/1094342020918305

A parallel hierarchical blocked adaptive cross approximation algorithm

Abstract: This paper presents a low-rank decomposition algorithm assuming any matrix element can be computed in O(1) time. The proposed algorithm first computes rank-revealing decompositions of sub-matrices with a blocked adaptive cross approximation (BACA) algorithm, and then applies a hierarchical merge operation via truncated singular value decompositions (H-BACA). The proposed algorithm significantly improves the convergence of the baseline ACA algorithm and achieves reduced computational complexity compared to the …
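The baseline ACA scheme that the abstract builds on can be sketched as follows. This is a generic, partially pivoted adaptive cross approximation — a textbook variant, not the paper's H-BACA implementation — where `entry(i, j)` is the assumed O(1) element oracle and the stopping test is the usual criterion ||u_k|| ||v_k|| ≤ tol · ||A_k||_F with a running Frobenius-norm estimate:

```python
import numpy as np

def aca(entry, m, n, tol=1e-6, max_rank=64):
    # Partially pivoted ACA: approximates the m-by-n matrix A as U @ V using
    # only O(r (m + n)) evaluations of entry(i, j) -> A[i, j].
    U, V = [], []
    used_rows = set()
    i = 0            # first pivot row
    frob2 = 0.0      # running estimate of ||U V||_F^2
    for _ in range(min(max_rank, m, n)):
        used_rows.add(i)
        # Residual row i: A[i, :] minus the current approximation.
        row = np.array([entry(i, j) for j in range(n)], float)
        for u, v in zip(U, V):
            row -= u[i] * v
        j = int(np.argmax(np.abs(row)))
        if abs(row[j]) < 1e-14:          # row already well approximated
            candidates = [r for r in range(m) if r not in used_rows]
            if not candidates:
                break
            i = candidates[0]
            continue
        v = row / row[j]
        # Residual column j.
        col = np.array([entry(p, j) for p in range(m)], float)
        for u, w in zip(U, V):
            col -= w[j] * u
        u = col
        # Update the Frobenius-norm estimate, then test the stopping criterion.
        for uu, ww in zip(U, V):
            frob2 += 2.0 * abs(uu @ u) * abs(ww @ v)
        nu, nv = np.linalg.norm(u), np.linalg.norm(v)
        frob2 += (nu * nv) ** 2
        U.append(u)
        V.append(v)
        if nu * nv <= tol * np.sqrt(frob2):
            break
        # Next pivot row: largest residual entry of column u over unused rows.
        mask = np.array([r in used_rows for r in range(m)])
        i = int(np.argmax(np.where(mask, -1.0, np.abs(u))))
    if not U:
        return np.zeros((m, 0)), np.zeros((0, n))
    return np.array(U).T, np.array(V)
```

The paper's contribution replaces this sequential rank-1 loop with blocked pivot selection (BACA) on sub-matrices, then merges the pieces hierarchically with truncated SVDs; the sketch above only illustrates the baseline being improved upon.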

Cited by 18 publications (7 citation statements) | References 29 publications
“…While we presented a preliminary analysis of the multiplicative update scheme's convergence behavior for a special case, future work is necessary for a thorough determination of the algorithm's region of convergence. There is also strong practical interest in the adaptation of the method to GPR extensions such as those based on non-Gaussian likelihoods and Nyström [36,37] or hierarchical low-rank approximations [38,39,40], as well as in Table 1: Label Noise - Rate/Level: the percentage of corrupted labels and the ratio between the noise and the standard deviation of the pristine labels; R²: the coefficient of determination between the inferred and actual label noise; AUC: area under the ROC curve of a 'noisy label' classifier that thresholds the learned σᵢ; Precision at Recall Level: precision of the classifier at specified recall levels. Regression accuracy - plain/basic/full: Σ = 0, σI, diag(σ), respectively.…”
Section: Discussion
confidence: 99%
“…At the third and last step, the Adaptive Cross Approximation [45] is used for admissible interactions, complemented by the standard full computation for close interactions. To evaluate the convergence of the ACA algorithm, we use a criterion similar to that in [45]. The distances between each set of particles X_I and Y_J are evaluated from their projections on the axis defined by the two centres of each dataset.…”
Section: Tensor Formulation Using RBF Discretization Consider Linear ...
confidence: 99%
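The distance evaluation this excerpt describes — projecting each point set onto the axis through the two cluster centres, then testing a standard admissibility condition — can be sketched as follows. The function names and the admissibility constant `eta` are illustrative assumptions, not taken from the cited work:

```python
import numpy as np

def axis_gap(X, Y):
    # Separation between point sets X and Y (rows are points), estimated
    # from their scalar projections onto the axis through the two centres.
    cx, cy = X.mean(axis=0), Y.mean(axis=0)
    axis = cy - cx
    norm = np.linalg.norm(axis)
    if norm == 0.0:                      # coincident centres: no separation
        return -np.inf
    axis /= norm
    px, py = X @ axis, Y @ axis          # scalar projections
    return py.min() - px.max()           # positive gap => separated sets

def admissible(X, Y, eta=2.0):
    # Standard admissibility test for low-rank (ACA) compression:
    # min(diam X, diam Y) <= eta * dist(X, Y), with dist from axis_gap.
    dx = 2.0 * np.linalg.norm(X - X.mean(axis=0), axis=1).max()
    dy = 2.0 * np.linalg.norm(Y - Y.mean(axis=0), axis=1).max()
    return min(dx, dy) <= eta * max(axis_gap(X, Y), 0.0)
```

Admissible pairs are then compressed with ACA, while close (inadmissible) interactions fall back to full dense computation, matching the split described in the quotation.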
“…An approximation algorithm refers to the scenario or accuracy of using the relevant algorithm to solve a practical problem, where the solution given approaches the theoretical optimal solution. In the reduction of a sample set, an approximation algorithm aims to give the smallest sample set as accurately as possible; obtaining high-quality samples is the sole standard for the approximation algorithm (Liu et al., 2020).…”
Section: Approximation Algorithm
confidence: 99%