2022
DOI: 10.1007/978-3-030-97020-8_3
Improving Extreme Search with Natural Gradient Descent Using Dirichlet Distribution

Cited by 5 publications (7 citation statements)
References 7 publications
“…We verified the capability of the proposed algorithm to converge in the neighborhood of the global minimum in the case of the Rastrigin and Rosenbrock functions, where known algorithms do not achieve the global minimum. Such experiments differ from the experiments in [14,15].…”
Section: Discussion (contrasting)
confidence: 63%
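
For context, the standard textbook definitions of the two benchmarks named above (not quoted from the citing paper) show why they are hard: Rastrigin is highly multimodal, and Rosenbrock has a narrow curved valley around its minimum.

f_Rastrigin(x) = 10n + \sum_{i=1}^{n} [ x_i^2 - 10\cos(2\pi x_i) ],  global minimum 0 at x = 0
f_Rosenbrock(x) = \sum_{i=1}^{n-1} [ 100(x_{i+1} - x_i^2)^2 + (1 - x_i)^2 ],  global minimum 0 at x = (1, ..., 1)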
“…In [14], which is a continuation of [15], we explored natural gradient descent based on the Dirichlet distribution. In this research, we additionally calculated the Fisher information matrix of the generalized Dirichlet distribution.…”
Section: Discussion (mentioning)
confidence: 99%
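
The Fisher information matrix of the standard Dirichlet distribution is known in closed form, which is what makes this line of work tractable. A minimal Python sketch (standard Dirichlet only; the generalized-Dirichlet case mentioned in the citation is not reproduced here, and the function name is ours):

import numpy as np
from scipy.special import polygamma

def dirichlet_fisher(alpha):
    # Closed-form Fisher information of Dirichlet(alpha):
    #   F_ij = trigamma(alpha_i) * delta_ij - trigamma(sum(alpha)),
    # where polygamma(1, .) is the trigamma function.
    alpha = np.asarray(alpha, dtype=float)
    return np.diag(polygamma(1, alpha)) - polygamma(1, alpha.sum())

Because F depends only on the distribution parameters alpha, it does not have to be re-estimated from samples at every iteration.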
“…But by selecting an appropriate probability distribution, such as the Gaussian or the Dirichlet, we can reduce the dependence on the variable θ in the Fisher information matrix, which makes it possible to avoid calculating it at every iteration. Such an approach is realized in [108]-[111]. Natural gradient descent, based on the Fisher-Rao metric, can replace second-order optimization algorithms owing to its rate of convergence and time consumption.…”
Section: Probability Density Function (mentioning)
confidence: 99%
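
A minimal sketch of one natural gradient step under these assumptions (it reuses dirichlet_fisher from the sketch above; the learning rate, the supplied objective gradient, and the positivity clamp are illustrative choices of ours, not taken from [108]-[111]):

import numpy as np

def natural_gradient_step(alpha, grad, lr=0.1):
    # The natural gradient direction d solves F(alpha) d = grad,
    # i.e. the ordinary gradient preconditioned by the inverse
    # Fisher information (the Fisher-Rao metric).
    F = dirichlet_fisher(alpha)
    d = np.linalg.solve(F, grad)
    # Dirichlet parameters must stay positive.
    return np.maximum(alpha - lr * d, 1e-6)

Solving the linear system instead of explicitly inverting F is the usual, numerically safer choice.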