2005
DOI: 10.1007/s11336-002-1032-6
Optimal Least-Squares Unidimensional Scaling: Improved Branch-and-Bound Procedures and Comparison to Dynamic Programming

Keywords: combinatorial data analysis, least-squares unidimensional scaling, branch-and-bound, dynamic programming

Cited by 21 publications (28 citation statements)
References 29 publications
“…Accordingly, two-mode KL-means clustering and nonnegative matrix factorization can be applied for analyzing these data. Since their original publication, these lipread consonant data have been reanalyzed using clustering or seriation methods (Brusco & Stahl, 2005b). However, the latter studies analyzed the data subsequent to transforming the asymmetric confusion proportions to a symmetric matrix.…”
Section: Example 1, Lipread Consonant Data
confidence: 99%
“…For any given pair of objects (i, j), asymmetry can potentially occur because a presented stimulus i could be mistaken more frequently for j than stimulus j would be mistaken for i, or vice versa. Likewise, for brand-switching applications in consumer psychology, where x_ij (for i ≠ j) is a measure reflecting the […] (Brusco, 2001; Brusco & Stahl, 2005b; DeCani, 1972; Flueck & Korsh, 1974; Hubert, 1976; Hubert et al., 2001, chap. 4; Ranyard, 1976).…”
confidence: 99%
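The asymmetry described in this excerpt can be isolated explicitly. Below is a minimal sketch, using a made-up 3×3 confusion-proportion matrix (not data from the paper), of the standard decomposition of a square matrix into symmetric and skew-symmetric parts; symmetrizing the proportions before scaling, as the cited studies did, amounts to keeping only the symmetric part S.

```python
import numpy as np

# Hypothetical confusion-proportion matrix (rows: presented stimulus,
# columns: response); off-diagonal asymmetry means X[i, j] != X[j, i].
X = np.array([[0.80, 0.15, 0.05],
              [0.05, 0.85, 0.10],
              [0.20, 0.10, 0.70]])

# Unique decomposition X = S + A into symmetric and skew-symmetric parts.
S = (X + X.T) / 2   # symmetric part: what remains after symmetrizing
A = (X - X.T) / 2   # skew-symmetric part: captures the asymmetry

# A[i, j] > 0 means stimulus i is mistaken for j more often than j for i.
print(A)
```

The skew-symmetric part A is what is discarded when the asymmetric proportions are transformed to a symmetric matrix before scaling.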
“…The Robinson (anti-Robinson) patterning of a dissimilarity matrix, which is named for the pioneering work of Robinson (1951), is characterized by never-increasing (never-decreasing) elements in the rows and columns of the matrix when moving away from the main diagonal. The importance of anti-Robinson structures stems from a variety of important applications in combinatorial data analysis, including: (a) anti-Robinson patterning is a necessary, although not sufficient, condition for a perfect unidimensional scaling of objects (see Brusco & Stahl, 2005a, for a discussion), (b) anti-Robinson patterning enables representation of a dissimilarity matrix via subsets of increasing diameter (Hubert, Arabie, & Meulman, 1998a, 1998b, 2006), and (c) anti-Robinson patterning is instrumental in the fitting of ultrametric trees (Hubert & Arabie, 1995; Hubert et al., 2006, chapter 5).…”
Section: Inducing Anti-Robinson Structure on Symmetric Matrices
confidence: 99%
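The anti-Robinson condition quoted above (entries never decreasing when moving away from the main diagonal within any row or column) is easy to state as a direct check. The sketch below is an illustrative implementation, not code from the cited work; it also demonstrates point (a) of the excerpt, since distances among points on a line always satisfy the condition.

```python
import numpy as np

def is_anti_robinson(D, tol=1e-12):
    """Check whether a symmetric dissimilarity matrix D is in
    anti-Robinson form: within every row and column, entries never
    decrease when moving away from the main diagonal.  Equivalently,
    for all i < j < k: D[i, j] <= D[i, k] and D[j, k] <= D[i, k]."""
    n = D.shape[0]
    for i in range(n):
        for j in range(i + 1, n):
            for k in range(j + 1, n):
                # moving right in row i, and up in column k,
                # must not decrease the dissimilarity
                if D[i, j] > D[i, k] + tol or D[j, k] > D[i, k] + tol:
                    return False
    return True

# Distances among points already ordered on a line (a perfect
# unidimensional scale) always yield an anti-Robinson matrix.
x = np.array([0.0, 1.0, 3.0, 6.0])
D = np.abs(x[:, None] - x[None, :])
print(is_anti_robinson(D))  # True for this example
```

Permuting the objects out of their line order generally destroys the pattern, which is why seriation methods search for the permutation that restores it.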
“…Although we could have opted to use a branch-and-bound method to attempt to improve locally-optimal permutations (Brusco & Stahl, 2005a, 2005b, chapters 8 and 10), we selected the dynamic programming approach because its CPU time is less affected by the structural properties of the proximity matrix. For example, in their comparison of various branch-and-bound methods to dynamic programming, Brusco and Stahl (2005a) found comparable CPU times when the symmetric dissimilarity matrices exhibited structural properties that were conducive to a unidimensional scale.…”
Section: Introduction
confidence: 99%
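Both the branch-and-bound and dynamic-programming procedures compared in the excerpt optimize the same least-squares unidimensional scaling criterion. The sketch below shows that criterion by exhaustive enumeration for a tiny example, using the well-known closed-form coordinates for a fixed object order (due to Defays, 1978); it is a didactic brute-force illustration under my own naming, not the algorithm of the cited paper, which avoids full enumeration.

```python
import itertools
import numpy as np

def ls_uds_brute_force(D):
    """Exhaustive least-squares unidimensional scaling for a small
    symmetric dissimilarity matrix D.  For a fixed order, the optimal
    coordinate of the object in position t is
        x_t = (sum of its dissimilarities to objects placed before it
               - sum of its dissimilarities to objects placed after it) / n,
    and minimizing the least-squares loss over orders is equivalent to
    maximizing sum(x**2).  Feasible only for very small n."""
    n = D.shape[0]
    best_perm, best_x, best_val = None, None, -np.inf
    for perm in itertools.permutations(range(n)):
        x = np.empty(n)
        for pos, obj in enumerate(perm):
            before = sum(D[obj, perm[q]] for q in range(pos))
            after = sum(D[obj, perm[q]] for q in range(pos + 1, n))
            x[pos] = (before - after) / n
        val = np.sum(x ** 2)
        if val > best_val:
            best_perm, best_x, best_val = perm, x.copy(), val
    return best_perm, best_x

# Example: distances among four collinear points are recovered exactly
# (up to translation and reflection).
coords = np.array([0.0, 2.0, 5.0, 9.0])
D = np.abs(coords[:, None] - coords[None, :])
perm, x = ls_uds_brute_force(D)
```

Branch-and-bound prunes this permutation search with bounds on the attainable criterion, while dynamic programming builds optimal partial orders over subsets, which is why its running time depends less on the matrix's structure.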