2020 59th IEEE Conference on Decision and Control (CDC)
DOI: 10.1109/cdc42340.2020.9304325
A Stochastic Consensus Method for Nonconvex Optimization on the Stiefel Manifold

Cited by 17 publications (13 citation statements, all classified as "mentioning"). References 26 publications.
“…CBO algorithms implement the update rule (2) (or possibly variations thereof, depending on the problem) and compute v_α(ρ_T^N) as a guess for the global minimizer. This concept of optimization has been explored and analyzed in several recent works [9,10,14,15,19,25,33], even in high-dimensional or non-Euclidean settings. As an example of their applicability to high-dimensional problems, we refer for instance to [9], where the authors illustrate the use of CBO for training a shallow neural network classifier for MNIST, or to [15], where (2) and (4) are adapted to the sphere S^{d-1} and achieve near state-of-the-art performance on phase retrieval and robust subspace detection problems.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
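For readers unfamiliar with the notation in the quote above, here is a minimal sketch of the standard isotropic, Euclidean CBO dynamics as commonly stated in the cited literature. The parameter names are illustrative, and the exact update rule (2) of the quoted paper may differ.

```python
import numpy as np

def consensus_point(X, f, alpha):
    """Consensus point v_alpha: a Gibbs-weighted average of the particles
    (rows of X), with weights exp(-alpha * f(x_i)) that concentrate on
    particles with low objective values as alpha grows."""
    energies = np.apply_along_axis(f, 1, X)
    w = np.exp(-alpha * (energies - energies.min()))  # shift for numerical stability
    return (w[:, None] * X).sum(axis=0) / w.sum()

def cbo_step(X, f, alpha=30.0, lam=1.0, sigma=0.7, dt=0.01, rng=None):
    """One Euler-Maruyama step of the isotropic CBO SDE
    dX_i = -lam (X_i - v_alpha) dt + sigma |X_i - v_alpha| dW_i."""
    rng = np.random.default_rng() if rng is None else rng
    v = consensus_point(X, f, alpha)
    diff = X - v
    dW = np.sqrt(dt) * rng.standard_normal(X.shape)
    return X - lam * diff * dt + sigma * np.linalg.norm(diff, axis=1, keepdims=True) * dW
```

Iterating cbo_step and returning consensus_point(X, f, alpha) at the final time corresponds to the quote's v_α(ρ_T^N) as the guess for the global minimizer.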
“…which means that X_α(μ_t^N) is a global best location at time t. It has been proved that CBO can guarantee global convergence under suitable assumptions [16], and it is a powerful and robust method for solving many interesting non-convex, high-dimensional optimization problems in machine learning [7,14]. By now, CBO methods have also been generalized to optimization over manifolds [13–15,22], and several variants have been explored which additionally use, for instance, personal best information [30] or binary interaction dynamics [3], or which connect CBO with Particle Swarm Optimization [8,19]. The reader is referred to [31] for a comprehensive review of recent developments in CBO methods.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
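The manifold generalizations mentioned in the quote keep the particles on the constraint set, e.g. the sphere S^{d-1}. The quoted references design carefully projected dynamics with curvature correction terms; the sketch below only conveys the idea with a crude renormalization (a retraction) after a Euclidean-style step, and all names are illustrative.

```python
import numpy as np

def sphere_cbo_step(X, f, alpha=30.0, lam=1.0, sigma=0.3, dt=0.01, rng=None):
    """CBO-type step for particles X (rows on the unit sphere S^{d-1}):
    Euclidean consensus drift plus diffusion, followed by renormalization."""
    rng = np.random.default_rng() if rng is None else rng
    energies = np.apply_along_axis(f, 1, X)
    w = np.exp(-alpha * (energies - energies.min()))
    v = (w[:, None] * X).sum(axis=0) / w.sum()  # consensus point (generally off the sphere)
    diff = X - v
    X_new = (X - lam * diff * dt
             + sigma * np.linalg.norm(diff, axis=1, keepdims=True)
               * np.sqrt(dt) * rng.standard_normal(X.shape))
    return X_new / np.linalg.norm(X_new, axis=1, keepdims=True)  # retract onto S^{d-1}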
“…It has been proved that CBO is a powerful and robust method for solving many interesting non-convex, high-dimensional optimization problems in machine learning [13]. By now, CBO methods have also been generalized to optimization over manifolds [20–22,35]. The objective of the present paper is to close a theory gap suggested in [25] by providing a rigorous proof of the zero-inertia limit.…”
Section: Introduction (citation type: mentioning)
confidence: 98%
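The zero-inertia limit referenced here connects second-order, PSO-type consensus dynamics to first-order CBO. The following is a hedged sketch of the formal computation with generic coefficients γ, λ, σ; the system studied in the cited works may carry additional memory or personal-best terms.

```latex
% PSO-type dynamics with inertia (mass) m > 0:
\begin{aligned}
  dX_t &= V_t\,dt,\\
  m\,dV_t &= -\gamma V_t\,dt
             + \lambda\bigl(v_\alpha(\rho_t) - X_t\bigr)\,dt
             + \sigma\,\bigl|v_\alpha(\rho_t) - X_t\bigr|\,dB_t.
\end{aligned}
% Formally letting m \to 0, the velocity equilibrates:
%   \gamma V_t\,dt \approx \lambda\,(v_\alpha(\rho_t) - X_t)\,dt
%                         + \sigma\,|v_\alpha(\rho_t) - X_t|\,dB_t,
% and substituting into dX_t = V_t\,dt recovers first-order CBO dynamics:
\begin{aligned}
  dX_t &= \tfrac{\lambda}{\gamma}\bigl(v_\alpha(\rho_t) - X_t\bigr)\,dt
        + \tfrac{\sigma}{\gamma}\bigl|v_\alpha(\rho_t) - X_t\bigr|\,dB_t.
\end{aligned}
```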