2024
DOI: 10.1017/s095679252400007x

Consensus-based optimisation with truncated noise

Massimo Fornasier,
Peter Richtárik,
Konstantin Riedl
et al.

Abstract: Consensus-based optimisation (CBO) is a versatile multi-particle metaheuristic optimisation method suitable for performing non-convex and non-smooth global optimisations in high dimensions. It has proven effective in various applications while at the same time being amenable to a theoretical convergence analysis. In this paper, we explore a variant of CBO, which incorporates truncated noise in order to enhance the well-behavedness of the statistics of the law of the dynamics. By introducing this additional tru…
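The abstract describes the method only at a high level. As a rough illustration of the idea, the following minimal Python sketch (not taken from the paper) shows what an isotropic CBO step with truncated noise could look like: particles drift toward a Gibbs-weighted consensus point, and the noise amplitude, normally the distance to the consensus, is capped at a truncation level before scaling the Brownian increments. All parameter names and values (alpha, lam, sigma, dt, the truncation level R) are illustrative assumptions, not the paper's choices.

```python
# Minimal sketch of consensus-based optimisation (CBO) with truncated noise.
# Parameters and truncation rule are illustrative assumptions, not the
# scheme analysed in the paper.
import numpy as np

def cbo_truncated_noise(f, x0, alpha=30.0, lam=1.0, sigma=0.8,
                        dt=0.01, R=1.0, n_steps=1000, rng=None):
    """Isotropic CBO iteration where the noise amplitude |x - consensus|
    is truncated at R before scaling the Gaussian increments."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)  # particle positions, shape (N, d)
    for _ in range(n_steps):
        values = np.apply_along_axis(f, 1, x)
        # Gibbs-type weights concentrate mass on the currently best particles.
        weights = np.exp(-alpha * (values - values.min()))
        consensus = (weights[:, None] * x).sum(axis=0) / weights.sum()
        diff = x - consensus
        dist = np.linalg.norm(diff, axis=1, keepdims=True)
        # Truncation: cap the noise amplitude at R (plain CBO would use the
        # raw distance to the consensus point here).
        amp = np.minimum(dist, R)
        noise = rng.standard_normal(x.shape)
        x = x - lam * dt * diff + sigma * np.sqrt(dt) * amp * noise
    return consensus

# Example usage: minimise a shifted quadratic in 5 dimensions.
if __name__ == "__main__":
    target = np.ones(5)
    f = lambda z: np.sum((z - target) ** 2)
    x0 = np.random.default_rng(0).uniform(-3, 3, size=(100, 5))
    print(cbo_truncated_noise(f, x0))
```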

Cited by 1 publication (1 citation statement)
References 42 publications
“…However, unlike many such methods, CBX has been designed to be compatible with rigorous convergence analysis at the mean-field level (the infinite-particle limit, see Huang & Qiu, 2022). Many convergence results have been shown, whether in the original formulation (Carrillo et al, 2018;Fornasier, Klock, et al, 2021), for CBO with anisotropic noise (Carrillo et al, 2021;Fornasier et al, 2022), with memory effects (Riedl, 2023), with truncated noise (Fornasier et al, 2024), for polarised CBO (Bungert et al, 2024), and PSO (Huang et al, 2023). The relation between CBO and stochastic gradient descent has been recently established by Riedl et al (2023), which suggests a previously unknown yet fundamental connection between derivative-free and gradient-based approaches.…”
Section: Mathematical Background
Citation type: mentioning (confidence: 99%)