A Comparison of Several Bandwidth and Profile Reduction Algorithms (1976)
DOI: 10.1145/355705.355707

Cited by 87 publications (28 citation statements); references 2 publications.
“…Equation (4) is a second-order ordinary differential equation in time and is discretized using the second-order Newmark scheme (Belytschko et al., 2000), wherein a banded LU decomposition is used to solve the system of algebraic equations. The Cuthill-McKee (Cuthill and McKee, 1969) and Gibbs-Poole-Stockmeyer (Gibbs et al., 1976) methods are used to re-index the nodes in order to create banded matrices. The details of this solid solver can be found in Zheng (2009).…”
Section: B. Vocal Fold Tissue Modeling (mentioning)
confidence: 99%
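The node-reordering step this excerpt describes can be sketched with SciPy's reverse Cuthill-McKee implementation. SciPy does not ship a Gibbs-Poole-Stockmeyer routine, so only RCM is shown, and the matrix below is a made-up example:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import reverse_cuthill_mckee

# Made-up symmetric sparsity pattern with a needlessly large bandwidth.
A = csr_matrix(np.array([
    [1, 0, 0, 0, 1],
    [0, 1, 1, 0, 0],
    [0, 1, 1, 0, 1],
    [0, 0, 0, 1, 0],
    [1, 0, 1, 0, 1],
], dtype=float))

def bandwidth(M):
    """Largest |row - col| over the nonzero entries."""
    r, c = M.nonzero()
    return int(np.abs(r - c).max())

perm = reverse_cuthill_mckee(A, symmetric_mode=True)
B = A[perm][:, perm]  # symmetric permutation P A P^T

print(bandwidth(A), bandwidth(B))  # the permuted matrix is no wider
```

A banded LU solver then factors `B` instead of `A`, touching only entries within the reduced bandwidth.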
“…Thus, our aim is to find a second permutation P2 such that S2 = P2 S1 P2^T has as much mass as possible in its block super- and sub-diagonals, i.e., so that as many of the inter-cluster interactions as possible are accounted for. Finding such permutations is studied in the literature on bandwidth minimization and is known to be NP-hard [9]. We propose a greedy heuristic suitable for bundle adjustment.…”
Section: Cluster-Tridiagonal (mentioning)
confidence: 99%
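The excerpt does not spell out the authors' heuristic, but as an illustration only, a minimal greedy chain-building sketch that tries to keep large inter-cluster interactions adjacent might look like this; `S` here is a hypothetical symmetric cluster-interaction matrix:

```python
import numpy as np

def greedy_cluster_order(S):
    """Illustrative greedy heuristic (not necessarily the paper's method):
    seed with the cluster pair having the largest interaction, then
    repeatedly attach the unplaced cluster with the strongest link to
    either end of the chain, so heavy interactions land on the block
    super-/sub-diagonals of the permuted matrix."""
    n = S.shape[0]
    W = S.astype(float).copy()
    np.fill_diagonal(W, 0.0)
    i, j = np.unravel_index(int(np.argmax(W)), W.shape)
    order, remaining = [i, j], set(range(n)) - {i, j}
    while remaining:
        left = max(remaining, key=lambda k: W[order[0], k])
        right = max(remaining, key=lambda k: W[order[-1], k])
        if W[order[0], left] >= W[order[-1], right]:
            order.insert(0, left); remaining.discard(left)
        else:
            order.append(right); remaining.discard(right)
    return order

# Hypothetical symmetric inter-cluster interaction masses.
S = np.array([[0, 5, 0, 1],
              [5, 0, 4, 0],
              [0, 4, 0, 3],
              [1, 0, 3, 0]], dtype=float)
order = greedy_cluster_order(S)
```

The returned `order` defines the block permutation P2; applying it symmetrically moves the heavy pairwise interactions onto the block tridiagonal.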
“…Each vector is represented as a pair of arrays of the same length NNZ (the number of nonzero entries): one of integer type for feature/target indices and one of float type for values (float to allow decaying), with the same array index coupling each (index, value) pair. This data structure can be viewed as a flattened adaptation of the Yale sparse matrix representation [10]. We further split the representations of features and targets for fast access to both, particularly for the tensor product yi ⊗ xi, the major computational cost of model updates.…”
Section: Feature Vector Generation in O(1n) (mentioning)
confidence: 99%
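A minimal sketch of such a paired-array vector, and of exploiting it so the outer-product update touches only nonzero cells, could look as follows (all names, sizes, and values are made up for illustration):

```python
import numpy as np

# Hypothetical paired-array sparse vectors: an integer index array and a
# float value array of the same length NNZ, coupled position by position.
x_idx = np.array([0, 3, 7], dtype=np.int64)   # feature indices
x_val = np.array([1.0, 0.5, 2.0])             # feature values
y_idx = np.array([1, 4], dtype=np.int64)      # target indices
y_val = np.array([3.0, -1.0])                 # target values

# Rank-1 model update W += eta * (y ⊗ x): only NNZ_y * NNZ_x cells are
# written, instead of the full dense outer product.
W = np.zeros((8, 8))
eta = 0.1
for r, yv in zip(y_idx, y_val):
    W[r, x_idx] += eta * yv * x_val
```

Keeping features and targets in separate index/value pairs, as the excerpt describes, is what lets the loop above iterate over each factor of the tensor product independently.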