2020
DOI: 10.1007/s12065-020-00465-x
Neighborhood centroid opposite-based learning Harris Hawks optimization for training neural networks

Cited by 17 publications (5 citation statements)
References 44 publications
“…They applied the OHHO to feature selection in breast cancer classification. Fan et al [113] proposed a novel HHO variant called NCOHHO, which improves the HHO through two mechanisms: the neighborhood centroid and opposition-based learning. In NCOHHO, the neighborhood centroid is used as the reference point when generating the opposite particle.…”
Section: Opposite HHO
Mentioning confidence: 99%
See 1 more Smart Citation
“…They applied the OHHO in feature selection in breast cancer classification. Fan et al [113] proposed a novel HHO version called NCOHHO, which improves the HHO by two mechanisms: neighborhood centroid and opposite-based learning. In NCOHHO, neighborhood centroid is considered a reference point in generating the opposite particle.…”
Section: Opposite Hhomentioning
confidence: 99%
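The quoted statement above describes the core NCOHHO idea: instead of reflecting a candidate solution through the midpoint of the search bounds (as in classical opposition-based learning), each particle is reflected through the centroid of its nearest neighbors. A minimal NumPy sketch of that mechanism is below; the function name, the neighbor count `k`, and the Euclidean neighborhood are illustrative assumptions, and the exact update rule in the paper may differ.

```python
import numpy as np

def neighborhood_centroid_opposition(population, k=3):
    """Sketch of neighborhood-centroid opposition-based learning:
    each candidate is reflected through the centroid of its k nearest
    neighbors rather than through the center of the search space.
    (Illustrative only; details follow assumptions, not the paper.)"""
    n, dim = population.shape
    opposites = np.empty_like(population)
    for i, x in enumerate(population):
        # Euclidean distances from x to every candidate (including itself)
        d = np.linalg.norm(population - x, axis=1)
        # k nearest neighbors, skipping x itself (index 0 after sorting)
        neighbors = population[np.argsort(d)[1:k + 1]]
        centroid = neighbors.mean(axis=0)
        # opposite particle: reflect x through the neighborhood centroid
        opposites[i] = 2.0 * centroid - x
    return opposites
```

In a metaheuristic loop, both the original and opposite particles would typically be evaluated, with the fitter of each pair retained for the next iteration.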
“…Fan et al [113] used their novel algorithm, NCOHHO, to train a multilayer feedforward neural network on five different datasets. Similar work combining the HHO with an ANN was conducted by Moayedi et al [236], who applied it to predict the compression coefficient of soil.…”
Section: Unit Commitment Problem
Mentioning confidence: 99%
“…The algorithm has a simple principle, few parameters, and good global search capability. Accordingly, HHO has been applied to image segmentation [48], neural network training [49], motor control [50], and other fields. However, like other swarm intelligence optimization algorithms, HHO suffers from slow convergence, low optimization accuracy, and a tendency to fall into local optima when solving complex optimization problems.…”
Section: Of 41
Mentioning confidence: 99%
“…the feed-forward network [19], the radial basis function (RBF) network [20], the recurrent neural network [21], and the convolutional neural network (CNN) [22]. The FNN is the most popular among them due to its straightforward design and effective functionality [23][24][25]. ANNs offer a high level of performance, are simple to implement, and can capture hidden relationships among the inputs.…”
Section: Introduction
Mentioning confidence: 99%