2019 IEEE Winter Conference on Applications of Computer Vision (WACV)
DOI: 10.1109/wacv.2019.00092

DIMAL: Deep Isometric Manifold Learning Using Sparse Geodesic Sampling

Abstract: This paper explores a fully unsupervised deep learning approach for computing distance-preserving maps that generate low-dimensional embeddings for a certain class of manifolds. We use the Siamese configuration to train a neural network to solve the problem of least squares multidimensional scaling for generating maps that approximately preserve geodesic distances. By training with only a few landmarks, we show a significantly improved local and nonlocal generalization of the isometric mapping as compared to a…
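A minimal sketch of the training idea the abstract describes: a Siamese (shared-weight) network mapping points to a low-dimensional space, trained to match embedded Euclidean distances to precomputed geodesic distances between sampled landmark pairs (least-squares MDS stress). The architecture, sizes, and names below are illustrative assumptions, not the authors' implementation.

```python
# Sketch of Siamese training for geodesic-distance-preserving embeddings.
# Architecture and hyperparameters are placeholders, not DIMAL's actual setup.
import torch
import torch.nn as nn

class EmbeddingNet(nn.Module):
    """Maps input points in R^m to a low-dimensional embedding in R^k."""
    def __init__(self, in_dim=3, out_dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, x):
        return self.net(x)

def mds_stress(model, x_i, x_j, d_geo):
    """Least-squares MDS stress: squared mismatch between embedded
    Euclidean distances and target geodesic distances."""
    y_i, y_j = model(x_i), model(x_j)      # Siamese: same network, shared weights
    d_emb = torch.norm(y_i - y_j, dim=1)
    return ((d_emb - d_geo) ** 2).mean()

# Toy usage with random landmark pairs and placeholder geodesic distances.
model = EmbeddingNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x_i, x_j = torch.randn(128, 3), torch.randn(128, 3)
d_geo = torch.rand(128)                    # would come from a geodesic solver
for _ in range(100):
    opt.zero_grad()
    loss = mds_stress(model, x_i, x_j, d_geo)
    loss.backward()
    opt.step()
```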

Cited by 26 publications (18 citation statements); references 39 publications.

Citation statements:
“…To overcome the introduction of overlaps with sequential chunking, we developed a seeded-chunking method (as shown in Algorithm 2). The seeded-chunking first uses the “farthest point sampling” algorithm [16, 3] to find l seeds for l chunks, such that the seeds are farthest away from each other. That is, starting from a randomly selected seed, the seeds are chosen one at a time, such that each new seed has the largest distance to the set of already selected seeds (i.e.…”
Section: Methods
confidence: 99%
“…To overcome the introduction of overlaps with sequential chunking, we developed a novel seeded-chunking method (as shown in Algorithm 2 ). The seeded-chunking first uses the ‘farthest point sampling’ algorithm ( Bronstein et al , 2009 ; Pai et al, 2019 ) to find l seeds for l chunks, such that the seeds are farthest away from each other. That is, starting from a randomly selected seed, the seeds are chosen one at a time, such that each new seed has the largest distance to the set of already selected seeds (i.e.…”
Section: Methods
confidence: 99%
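Both excerpts above describe farthest point sampling: start from a random seed, then repeatedly add the point whose distance to the already-selected seed set is largest. A minimal sketch follows; the function name and the Euclidean metric are illustrative assumptions (the cited works sample with respect to geodesic distances).

```python
# Sketch of farthest point sampling over a point set (Euclidean distances).
import numpy as np

def farthest_point_sampling(points, num_seeds, rng=None):
    rng = np.random.default_rng(rng)
    n = points.shape[0]
    seeds = [int(rng.integers(n))]                 # random initial seed
    # Distance of every point to the current seed set (min over selected seeds).
    dist_to_set = np.linalg.norm(points - points[seeds[0]], axis=1)
    for _ in range(num_seeds - 1):
        new_seed = int(np.argmax(dist_to_set))     # farthest from the selected set
        seeds.append(new_seed)
        d_new = np.linalg.norm(points - points[new_seed], axis=1)
        dist_to_set = np.minimum(dist_to_set, d_new)
    return seeds

# Toy usage on random 3-D points.
pts = np.random.rand(1000, 3)
print(farthest_point_sampling(pts, num_seeds=5))
```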
“…Finally, it is possible to use a nonlinear parametric model for the embedding, f_Θ(x): R^m → R^k, where Θ is a set of model parameters, whose number is much smaller than n. For example, [27] finds such a parametric mapping by training a neural network to minimize the following loss:…”
Section: Subspace Methods
confidence: 99%
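The loss itself is truncated in the excerpt above. Based on the abstract (least-squares multidimensional scaling over geodesic distances), it plausibly takes the stress form below; this is an assumed reconstruction, not the quoted equation, with P denoting the set of sampled landmark pairs and d_geo their geodesic distances.

```latex
% Assumed least-squares (stress) form of the loss for a parametric embedding f_\Theta:
\min_{\Theta} \; \sum_{(i,j)\in\mathcal{P}}
  \Bigl( \bigl\| f_{\Theta}(x_i) - f_{\Theta}(x_j) \bigr\|_2 - d_{\mathrm{geo}}(x_i, x_j) \Bigr)^{2}
```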
“…(Image © taken from [13] with permission of the authors). (d) A three-dimensional embedding of the camera pose manifold (images of the same object taken with different elevation and azimuth) from a small subset of distances using a machine learning-based nonlinear parametric MDS model [27]. (Image courtesy of the authors) rank order of dissimilarities such as correspondence problems between non-isometric shapes.…”
Section: Applications
confidence: 99%