2005
DOI: 10.1137/04060593x

Augmented Implicitly Restarted Lanczos Bidiagonalization Methods

Cited by 299 publications (326 citation statements)
References 26 publications
“…This approach proved advantageous in terms of the time needed for the truncation. Alternative ways to compute the truncated SVD are possible and can be found in [16,3,24]. The cost of computing the truncation depends, for example in the truncated SVD approach, on the cost of multiplying with the matrix UV^T.…”
Section: Truncation Operator and Matrix Inner Products
confidence: 99%
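
The point about the multiplication cost can be illustrated with a short, hedged sketch (not code from the cited paper; all names and sizes are illustrative): when the matrix is kept in factored form U V^T, a Lanczos-type truncated SVD only needs matrix-vector products, which cost O((m+n)r) each instead of the O(mn) needed after forming the product explicitly.

import numpy as np
from scipy.sparse.linalg import LinearOperator, svds

rng = np.random.default_rng(0)
m, n, r, k = 2000, 1500, 40, 10          # illustrative sizes: factor rank r, truncation rank k
U = rng.standard_normal((m, r))
V = rng.standard_normal((n, r))

# A = U V^T is never formed; only its action on vectors is provided.
A = LinearOperator((m, n),
                   matvec=lambda x: U @ (V.T @ x),
                   rmatvec=lambda y: V @ (U.T @ y),
                   dtype=np.float64)

# Truncated SVD via SciPy's Lanczos-type sparse SVD solver.
Uk, sk, Vkt = svds(A, k=k)
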
“…Additionally, we have to ensure that the inner products within the iterative solver are computed efficiently. Due to the properties of the trace operator, we are in luck, as…”
Section: Truncation Operator and Matrix Inner Products
confidence: 99%
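
The quoted sentence breaks off, but the trace-operator property presumably being invoked is the standard identity for the Frobenius inner product of matrices kept in factored form (a hedged reconstruction, not the citing paper's notation):

\langle U V^{\top}, W Z^{\top} \rangle_F = \operatorname{trace}\!\left(V U^{\top} W Z^{\top}\right) = \operatorname{trace}\!\left((U^{\top} W)(Z^{\top} V)\right),

so the inner product can be evaluated from the small r-by-r products U^T W and Z^T V at cost O((m+n) r^2), without ever forming the full m-by-n matrices.
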
“…In our analysis we used the "irlba" toolbox [4] in R and its implementation of the SVD algorithm [27] to produce a more compact dataset. The SVD method transforms the matrix P as:…”
Section: High Dimensionality and Sparsity Challenge
confidence: 99%
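
The excerpt is truncated, but the transformation computed by irlba is the usual rank-k truncated SVD (stated here as a hedged reconstruction, not the citing paper's exact formula):

P \approx U_k \Sigma_k V_k^{\top},

with the compact dataset typically taken as the projected rows P V_k = U_k \Sigma_k, so each observation is represented by k coordinates instead of the original column dimension.
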
“…For inventor citations, use a binary patent-by-inventor matrix with the positive cells representing a citation relationship between a patent and an inventor. Given the size of the matrices (millions by tens of thousands), an SVD approximation algorithm may be preferred, such as Lanczos (Baglama et al 2005). …”
Section: Machine Learning Methodologies
confidence: 99%
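
As a hedged illustration of that workflow (all names, sizes, and data here are made up, and this is not code from the citing paper), a sparse binary patent-by-inventor matrix can be assembled from coordinate pairs and a handful of singular triplets extracted with a Lanczos-type sparse SVD solver:

import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import svds

rng = np.random.default_rng(0)
n_patents, n_inventors = 200, 50          # real matrices would be millions by tens of thousands

# (patent, inventor) index pairs standing in for citation records.
rows = rng.integers(0, n_patents, size=1000)
cols = rng.integers(0, n_inventors, size=1000)
X = csr_matrix((np.ones(rows.size), (rows, cols)), shape=(n_patents, n_inventors))
X = (X > 0).astype(np.float64)            # keep the matrix binary even where pairs repeat

# Truncated SVD of the sparse matrix; only k singular triplets are computed.
U, s, Vt = svds(X, k=10)
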