2020
DOI: 10.1613/jair.1.12192
Gradient-based Learning Methods Extended to Smooth Manifolds Applied to Automated Clustering

Abstract: Grassmann-manifold-based sparse spectral clustering is a classification technique that consists in learning a sparse latent representation of data, formed by a subspace basis. To learn this latent representation, spectral clustering is formulated as a loss-minimization problem over a smooth manifold known as the Grassmannian. Such a minimization problem cannot be tackled by traditional gradient-based learning algorithms, which are only suitable for optimization in the absence …

Cited by 5 publications (1 citation statement)
References 31 publications
“…The second-order system (27) represents an improvement to the first-order optimization method (28) and traces back to Polyak's heavy-ball method [40], which, however, came as a discrete-time iterative algorithm. The heavy-ball method (later known as Nesterov accelerated gradient optimization method) may indeed be studied through an associated continuous-time differential equation, as shown in, e.g., [41][42][43]. The dynamical system (27) would constitute a natural generalization of the second-order differential equation associated with Nesterov's iteration, under the proviso that one allows the friction coefficient µ to decrease in time as fast as 1/t.…”
Section: Non-conservative Forcing Terms and Reduced-order Systems
confidence: 99%
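The discrete-time heavy-ball iteration mentioned in the citation statement above can be sketched as follows. This is a minimal illustrative example on a toy quadratic objective, not code from the cited paper; the objective, step size, and momentum coefficient are all assumed values chosen for demonstration.

```python
import numpy as np

def heavy_ball(grad, x0, lr=0.1, momentum=0.9, steps=200):
    """Polyak's heavy-ball method:
       x_{k+1} = x_k - lr * grad(x_k) + momentum * (x_k - x_{k-1}).
    The momentum term reuses the previous displacement, which is what the
    associated continuous-time ODE models as a friction/inertia effect."""
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(steps):
        x_next = x - lr * grad(x) + momentum * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Ill-conditioned quadratic f(x) = 0.5 * x^T A x, minimized at the origin.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
x_star = heavy_ball(grad, np.array([5.0, 5.0]))
print(np.linalg.norm(x_star))  # norm shrinks toward 0, the minimizer
```

For this quadratic, the momentum term damps the oscillations that plain gradient descent exhibits along the stiff coordinate; Nesterov's accelerated variant differs in evaluating the gradient at the extrapolated point rather than at the current iterate.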