2018
DOI: 10.48550/arxiv.1808.02229
Preprint

Grassmannian Learning: Embedding Geometry Awareness in Shallow and Deep Learning

Jiayao Zhang,
Guangxu Zhu,
Robert W. Heath
et al.

Abstract: Modern machine learning algorithms have been adopted in a range of signal-processing applications spanning computer vision, natural language processing, and artificial intelligence. Many relevant problems involve subspace-structured features, orthogonality constrained or low-rank constrained objective functions, or subspace distances. These mathematical characteristics are expressed naturally using the Grassmann manifold. Unfortunately, this fact is not yet explored in many traditional learning algorithms. In …

Cited by 11 publications (18 citation statements)
References 36 publications
“…, p, provides a measure of the distance between two points U_A, U_B ∈ G(p, n). A number of different distance measures are defined using the principal angles [59].…”
Section: Grassmann Manifold Projection
confidence: 99%
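To make the principal-angle construction in that snippet concrete, here is a minimal NumPy sketch (ours, not from the cited papers): the singular values of U_A^T U_B are the cosines of the principal angles, and the geodesic distance is the 2-norm of those angles. The dimensions (n = 10, p = 3) and the random test subspaces are illustrative assumptions.

```python
import numpy as np

def principal_angles(UA, UB):
    """Principal angles between span(UA) and span(UB), where UA and UB
    are (n, p) matrices with orthonormal columns, i.e. points on G(p, n)."""
    # The singular values of UA^T UB are the cosines of the principal angles.
    cosines = np.linalg.svd(UA.T @ UB, compute_uv=False)
    cosines = np.clip(cosines, -1.0, 1.0)  # guard against round-off
    return np.arccos(cosines)

def geodesic_distance(UA, UB):
    """Arc-length (geodesic) distance on G(p, n): the 2-norm of the angles."""
    return np.linalg.norm(principal_angles(UA, UB))

# Illustrative usage: two random 3-dimensional subspaces of R^10.
rng = np.random.default_rng(0)
UA, _ = np.linalg.qr(rng.standard_normal((10, 3)))
UB, _ = np.linalg.qr(rng.standard_normal((10, 3)))
print(geodesic_distance(UA, UB))
```

Other standard Grassmann distances, such as the projection distance ‖sin θ‖ or the chordal distance, are equally simple functions of the same angles.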
“…We will stick to the Gaussian kernel, but use a distance function on the Grassmann manifold spanned by the respective matricizations (see Figure 2). Considering data matrices as points on the Grassmann manifold is the key concept in Grassmannian learning [25,26,27]. Results from this field indicate that using a distance function on the Grassmann manifold tends to improve the performance of distance-based machine learning systems for tensor data [26,27].…”
Section: A. Tensors and Tensor Kernels
confidence: 99%
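A rough sketch of that idea (our illustration, not the cited papers' code): swap the Euclidean distance inside a Gaussian kernel for a Grassmann distance between the dominant column spaces of one matricization of each tensor. The unfolding mode, subspace dimension r, and bandwidth gamma are assumed hyperparameters.

```python
import numpy as np

def subspace(A, r):
    """Orthonormal basis for the dominant r-dimensional column space of A."""
    U, _, _ = np.linalg.svd(A, full_matrices=False)
    return U[:, :r]

def grassmann_distance(UA, UB):
    """Geodesic distance between two subspaces with orthonormal bases UA, UB."""
    s = np.clip(np.linalg.svd(UA.T @ UB, compute_uv=False), -1.0, 1.0)
    return np.linalg.norm(np.arccos(s))

def grassmann_gaussian_kernel(X, Y, mode=0, r=3, gamma=1.0):
    """Gaussian kernel exp(-gamma * d^2), with d measured on the Grassmann
    manifold between the dominant column spaces of the mode-`mode`
    matricizations of tensors X and Y."""
    Xm = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)  # mode-k unfolding
    Ym = np.moveaxis(Y, mode, 0).reshape(Y.shape[mode], -1)
    d = grassmann_distance(subspace(Xm, r), subspace(Ym, r))
    return np.exp(-gamma * d**2)

# Illustrative usage on two random 3-way tensors.
rng = np.random.default_rng(1)
X, Y = rng.standard_normal((8, 6, 5)), rng.standard_normal((8, 6, 5))
print(grassmann_gaussian_kernel(X, Y, mode=0, r=3))
```

One caveat: a Gaussian kernel built on the geodesic distance is not guaranteed to be positive definite; kernels based on the projection metric are a common positive-definite alternative.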
“…Dimensionality reduction is achieved using the Grassmannian diffusion maps (GDMaps) technique introduced in [50], which identifies a latent representation of the dataset on a lower-dimensional manifold via a two-step procedure. In the first step, high-dimensional data (model solutions) are projected onto an orthonormal matrix manifold called the Grassmann manifold [51,52] that defines the subspace structure of the data. In the second step, the diffusion maps (DMaps) method is employed to unfold the underlying nonlinear geometry of the data on the Grassmann manifold onto a diffusion manifold.…”
Section: Introduction
confidence: 99%
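As a minimal sketch of the two-step procedure described above (under our own simplifying assumptions, not the reference implementation of [50]): step one represents each snapshot by the leading left singular vectors of its thin SVD, a point on the Grassmann manifold; step two runs standard diffusion maps on pairwise Grassmann geodesic distances. The subspace dimension r, the bandwidth eps, and the random snapshots are all hypothetical.

```python
import numpy as np

def grassmann_project(samples, r):
    """Step 1: represent each (m x n) snapshot by its r leading left
    singular vectors, an orthonormal basis and thus a point on G(r, m)."""
    return [np.linalg.svd(S, full_matrices=False)[0][:, :r] for S in samples]

def grassmann_distance(UA, UB):
    s = np.clip(np.linalg.svd(UA.T @ UB, compute_uv=False), -1.0, 1.0)
    return np.linalg.norm(np.arccos(s))

def diffusion_maps(D, eps, n_coords=2):
    """Step 2: standard diffusion maps on a precomputed distance matrix D."""
    K = np.exp(-D**2 / eps)                 # Gaussian affinities
    d = K.sum(axis=1)                       # degrees of the affinity graph
    # Eigen-decompose the symmetric conjugate of the Markov matrix D^-1 K.
    A = K / np.sqrt(np.outer(d, d))
    vals, vecs = np.linalg.eigh(A)
    idx = np.argsort(vals)[::-1]
    vals, vecs = vals[idx], vecs[:, idx]
    psi = vecs / np.sqrt(d)[:, None]        # right eigenvectors of D^-1 K
    # Skip the trivial constant eigenvector; scale coordinates by eigenvalues.
    return psi[:, 1:n_coords + 1] * vals[1:n_coords + 1]

# Illustrative GDMaps pipeline on random snapshots.
rng = np.random.default_rng(2)
snaps = [rng.standard_normal((20, 10)) for _ in range(15)]
pts = grassmann_project(snaps, r=3)
D = np.array([[grassmann_distance(a, b) for b in pts] for a in pts])
coords = diffusion_maps(D, eps=np.median(D[D > 0])**2)
print(coords.shape)  # (15, 2): a 2-D diffusion-manifold embedding
```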