2018
DOI: 10.3389/fams.2018.00012
Deep Nets for Local Manifold Learning

Abstract: This paper considers the problem of extending a function f, defined on a training data set C lying on an unknown manifold X, to the entire manifold and a tubular neighborhood of it. For X embedded in a high-dimensional ambient Euclidean space R^D, a deep learning algorithm is developed for finding a local coordinate system for the manifold without eigen-decomposition, which reduces the problem to the classical problem of function approximation on a low-dimensional cube. Deep nets (or multilayered neural…

Cited by 39 publications (22 citation statements) | References 55 publications
“…On the other hand, it was argued in [1,12], on practical grounds, that deep nets can handle data on highly curved manifolds where any shallow net fails. These arguments were theoretically verified in [9,41], with the implication that adding hidden layers to shallow nets should enable neural networks to process massive data in a high-dimensional space from samples on lower-dimensional manifolds. More precisely, it follows from [13,41] that for a d-dimensional connected and compact C^∞ Riemannian submanifold X ⊆ [−1, 1]^D (without boundary), isometrically embedded in R^D and endowed with the geodesic distance d_G, there exists some δ > 0 such that for any x, x′ ∈ X with d_G(x, x′) < δ,…”
Section: Localized Approximation and Localized Manifold Learning
confidence: 91%
“…In this section, we present a construction of deep neural networks (called deep nets, for simplicity) with three hidden layers to realize certain deep learning algorithms, by applying the mathematical tools of localized approximation in [7], local manifold learning in [9], and local average arguments in [19]. Throughout this paper, we consider only two activation functions: the Heaviside function σ_0 and the square-rectifier σ_2, where the standard notation t_+ = max{0, t} is used to define σ_n(t) = t_+^n = (t_+)^n for any non-negative integer n.…”
Section: Construction of Deep Nets
confidence: 99%
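The two activation functions named in the quoted construction have a simple closed form. A minimal sketch, assuming NumPy and using the convention that the Heaviside function takes the value 0 at t = 0 (the cited paper's convention at the origin is not stated here):

```python
import numpy as np

def sigma(t, n):
    """sigma_n(t) = (t_+)**n with t_+ = max(0, t).

    n = 0 gives the Heaviside function sigma_0 (value at t = 0 is a
    convention; here we use 0), and n = 2 gives the square-rectifier
    sigma_2 from the quoted construction.
    """
    t = np.asarray(t, dtype=float)
    if n == 0:
        # Heaviside: 1 for t > 0, else 0
        return (t > 0).astype(float)
    return np.maximum(t, 0.0) ** n

print(sigma([-1.0, 0.5, 2.0], 0))  # Heaviside values
print(sigma([-1.0, 0.5, 2.0], 2))  # square-rectifier values
```

Note that σ_2 is continuously differentiable, unlike the ReLU (n = 1), which is one reason higher-order rectifiers appear in approximation-theoretic constructions of this kind.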
“…In this era of big data, data sets of massive size and varied features are routinely acquired, posing a crucial challenge to machine learning: designing learning strategies for data management, particularly for realizing certain data features. Deep learning [11] is a state-of-the-art approach for realizing such features, including localized position information [3,5], geometric structures of data sets [4,29], and data sparsity [17,18]. For this and other reasons, deep learning has recently received much attention and has been successful in various application domains [8], such as computer vision, speech recognition, image classification, fingerprint recognition, and earthquake forecasting.…”
Section: Introduction
confidence: 99%