2021
DOI: 10.1109/joe.2020.2989853
Acoustic Seabed Classification Based on Multibeam Echosounder Backscatter Data Using the PSO-BP-AdaBoost Algorithm: A Case Study From Jiaozhou Bay, China

Cited by 36 publications (15 citation statements)
References 45 publications
“…Finally, the best weak classifier from each iteration is selected, and AdaBoost combines these weak classifiers to construct a strong classifier [ 54 ]. The weak classifiers do not carry equal weights: the stronger a weak classifier, the higher the weight it is assigned [ 55 ].…”
Section: AI-Based Discriminant Algorithms
confidence: 99%
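The combination rule this excerpt describes can be illustrated with a minimal sketch. It assumes binary ±1 labels and weak classifiers exposing a `predict()` method; both are illustrative assumptions, not the paper's exact setup:

```python
# Minimal sketch of AdaBoost's weighted majority vote (assumed setup:
# binary +/-1 labels, weak classifiers with a predict() method).
import numpy as np

def strong_classify(weak_classifiers, alphas, X):
    """Weighted vote: weak classifiers with larger alpha contribute
    more to the final decision, as the excerpt describes."""
    votes = sum(alpha * clf.predict(X)          # each predict() returns +/-1
                for clf, alpha in zip(weak_classifiers, alphas))
    return np.sign(votes)                       # strong classifier H(x)
```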
“…The weak classifiers do not carry equal weights: the stronger a weak classifier, the higher the weight it is assigned [ 55 ]. Specifically, the weighted error of the $k$-th weak classifier is written as [ 54 ]

$$\epsilon_k = \sum_{i=1}^{N} w_{k,i}\, I\big(G_k(x_i) \neq y_i\big),$$

where $w_{k,i}$ denotes the weight of the $i$-th training sample at iteration $k$. The weight coefficient of the $k$-th weak classifier is then defined as [ 54 ]

$$\alpha_k = \frac{1}{2} \ln \frac{1-\epsilon_k}{\epsilon_k}.$$
…”
Section: AI-Based Discriminant Algorithms
confidence: 99%
“…22 The AdaBoost algorithm can use different algorithms as its weak classifiers: through repeated learning, it continually adjusts both the probability distribution of the training samples and the proportion each weak classifier contributes to the strong classifier, from which the final output is determined. The core of the AdaBoost algorithm lies in the selection of weak classifiers; choosing a suitable weak classifier can greatly improve AdaBoost's classification accuracy, 23,24 and the error rate can be adjusted adaptively according to the weak classifier's feedback, yielding high classification accuracy and execution efficiency, so the algorithm is widely used. 25 Its model structure is shown in Figure 2, and a code sketch of the idea follows below.…”
Section: Improved BP-AdaBoost Algorithm
confidence: 99%
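The BP-AdaBoost scheme this excerpt describes can be sketched roughly as follows. Scikit-learn's `MLPClassifier` stands in for a BP neural network, and since it accepts no per-sample weights, the changing sample distribution is emulated by weighted resampling; all hyperparameters are illustrative assumptions:

```python
# Rough sketch of BP-AdaBoost: BP-style neural networks as weak
# classifiers inside an AdaBoost loop (assumed setup, not the paper's
# exact implementation).
import numpy as np
from sklearn.neural_network import MLPClassifier

def bp_adaboost_fit(X, y, n_rounds=8, rng=np.random.default_rng(0)):
    n = len(y)
    w = np.full(n, 1.0 / n)                  # initial uniform sample weights
    classifiers, alphas = [], []
    for _ in range(n_rounds):
        idx = rng.choice(n, size=n, p=w)     # resample by current distribution
        clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=500)
        clf.fit(X[idx], y[idx])
        pred = clf.predict(X)
        eps = np.sum(w * (pred != y))        # weighted error of this round
        if eps >= 0.5:                       # no better than chance: skip
            continue
        alpha = 0.5 * np.log((1 - eps) / max(eps, 1e-10))
        w *= np.exp(np.where(pred == y, -alpha, alpha))  # boost hard samples
        w /= w.sum()
        classifiers.append(clf)
        alphas.append(alpha)
    return classifiers, alphas
```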
“…Therefore, in the LSSVM model based on the radial basis function (RBF) kernel, the kernel width $\sigma$ of the RBF and the penalty factor C are the main parameters that affect its performance. In order to improve the prediction accuracy, the above-mentioned particle swarm optimization algorithm is used to optimize these two parameters [10].…”
Section: Least Squares Support Vector Machine (LSSVM)
confidence: 99%
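A rough sketch of the PSO tuning described here, under stated assumptions: scikit-learn's `SVC` stands in for LSSVM (which scikit-learn does not provide), the kernel width enters via the `gamma` parameter, and the search bounds and PSO constants are arbitrary choices scored by cross-validated accuracy:

```python
# Hedged sketch: particle swarm optimization over (gamma, C) for an
# RBF-kernel SVM, as a stand-in for the PSO-tuned LSSVM in the excerpt.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def pso_tune_svm(X, y, n_particles=10, n_iters=20,
                 rng=np.random.default_rng(1)):
    lo, hi = np.array([1e-3, 1e-2]), np.array([10.0, 100.0])  # (gamma, C)
    pos = rng.uniform(lo, hi, (n_particles, 2))   # particle positions
    vel = np.zeros_like(pos)

    def score(p):                                 # fitness: CV accuracy
        return cross_val_score(SVC(kernel="rbf", gamma=p[0], C=p[1]),
                               X, y, cv=3).mean()

    pbest = pos.copy()
    pbest_val = np.array([score(p) for p in pos])
    gbest = pbest[pbest_val.argmax()]
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, 1))
        # Standard velocity update: inertia + cognitive + social terms.
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([score(p) for p in pos])
        better = vals > pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[pbest_val.argmax()]
    return {"gamma": gbest[0], "C": gbest[1]}
```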