2021
DOI: 10.3390/e23070812

A Novel Method to Determine Basic Probability Assignment Based on Adaboost and Its Application in Classification

Abstract: In the framework of evidence theory, one of the open and crucial issues is how to determine the basic probability assignment (BPA), which directly affects whether the decision result is correct. This paper proposes a novel method for obtaining BPA based on Adaboost. The method uses training data to generate multiple strong classifiers for each attribute model, which are used to determine the BPA of the singleton proposition, since the weights of classification provide necessary information for fundamental …
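The abstract is truncated above, so the exact procedure is not fully specified here. As a hedged sketch of the general idea it describes, the snippet below trains one scikit-learn AdaBoostClassifier per attribute and reads its normalized class scores as masses on singleton propositions, leaving a fixed residual mass on the whole frame Θ; the dataset, the 0.9 discount, and the helper name attribute_bpa are illustrative assumptions, not the authors' method.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import AdaBoostClassifier

X, y = load_iris(return_X_y=True)
classes = [int(c) for c in np.unique(y)]

def attribute_bpa(attr_idx, x_test, discount=0.9):
    """Train AdaBoost on a single attribute and read its class probabilities
    as masses on singleton propositions; the rest goes to the whole frame."""
    clf = AdaBoostClassifier(n_estimators=50, random_state=0)
    clf.fit(X[:, [attr_idx]], y)                        # one model per attribute
    proba = clf.predict_proba([[x_test[attr_idx]]])[0]  # class scores for the test sample
    bpa = {frozenset([c]): discount * p for c, p in zip(classes, proba)}
    bpa[frozenset(classes)] = 1.0 - discount            # assumed ignorance mass on Theta
    return bpa

print(attribute_bpa(0, X[0]))  # BPA built from attribute 0 of the first sample
```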

Cited by 8 publications (6 citation statements)
References 37 publications
“…A detailed introduction to these methods of constructing multiclass classification models is as follows: AdaBoost: AdaBoost is short for adaptive boosting and is based on the boosting algorithm. The structure of the model is simple, and its classification accuracy is high because it concentrates on the misclassified samples and finally constructs a well-performing model. Bagging: Bagging is short for bootstrap aggregation, and it can improve accuracy by reducing variance.…”
Section: Machine Learning Methods
Mentioning, confidence: 99%
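As a small illustration of the two ensemble strategies named in the excerpt above, the sketch below fits scikit-learn's AdaBoostClassifier (which re-weights misclassified samples) and a bagged decision tree (bootstrap aggregation, mainly a variance-reduction device) on a toy dataset; the dataset and parameters are placeholders, not drawn from the citing papers.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# AdaBoost: sequentially re-weights the training set so that later weak
# learners concentrate on the samples misclassified so far.
boost = AdaBoostClassifier(n_estimators=100, random_state=0)

# Bagging (bootstrap aggregation): trains trees on bootstrap resamples and
# averages their votes, which mainly reduces variance.
bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100, random_state=0)

for name, model in [("AdaBoost", boost), ("Bagging", bag)]:
    print(name, cross_val_score(model, X, y, cv=5).mean().round(3))
```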
“…AdaBoost: AdaBoost is short for adaptive boosting and is based on the boosting algorithm. The structure of the model is simple, and its classification accuracy is high because it concentrates on the misclassified samples and finally constructs a well-performing model.…”
Section: Machine Learning Methods
Mentioning, confidence: 99%
“…Wang and Tang [13] constructed a Gaussian distribution from the mean and variance of the training samples in the data set and generated the BBA by evaluating the Gaussian density function at the test sample. Fu et al. [14] constructed multiple strong classifiers with the Adaboost algorithm and recorded the corresponding weights to determine the BBAs of singleton propositions, obtaining the BBAs of composite propositions by quantifying the cross-sectional area of the intersections of singleton propositions. To quantify the uncertainty of the BBA function, Pan et al. [15] proposed a new belief entropy based on the likelihood transformation and weighted Hartley entropy.…”
Section: Introduction
Mentioning, confidence: 99%
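A minimal sketch of the Gaussian-style BBA construction summarized above for Wang and Tang [13], assuming a single attribute and singleton propositions only; the normalization over classes and the choice of dataset are simplifying assumptions rather than the cited procedure.

```python
import numpy as np
from scipy.stats import norm
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)
ATTR = 0  # illustrate with a single attribute

def gaussian_bba(x_value):
    """Fit a per-class Gaussian from the training mean/variance and normalize
    the density values at x_value into masses on singleton classes."""
    density = {}
    for c in np.unique(y):
        samples = X[y == c, ATTR]
        density[int(c)] = norm.pdf(x_value, loc=samples.mean(), scale=samples.std(ddof=1))
    total = sum(density.values())
    return {c: d / total for c, d in density.items()}

print(gaussian_bba(X[0, ATTR]))  # masses for the first sample's attribute value
```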
“…Therefore, it can be applied in many fields, including pattern recognition [9,10], fault diagnosis [11][12][13][14][15], and risk assessment [16,17]. In recent years, D-S evidence theory has received much attention as a promising theory for handling uncertain information in multi-sensor fusion systems [18][19][20][21][22][23][24][25][26][27].…”
Section: Introduction
Mentioning, confidence: 99%
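Since the excerpt emphasizes multi-sensor fusion, the sketch below shows Dempster's rule of combination, the standard D-S fusion step, applied to two hypothetical sensor BBAs over a two-element frame; the mass values are illustrative only.

```python
from itertools import product

def combine(m1, m2):
    """Fuse two BBAs (dicts mapping frozensets to masses) with Dempster's rule."""
    fused, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb          # mass falling on the empty intersection
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {s: m / (1.0 - conflict) for s, m in fused.items()}  # normalize by 1 - K

A, B = frozenset({"A"}), frozenset({"B"})
theta = A | B
m_sensor1 = {A: 0.6, B: 0.1, theta: 0.3}  # hypothetical sensor 1
m_sensor2 = {A: 0.5, B: 0.2, theta: 0.3}  # hypothetical sensor 2
print(combine(m_sensor1, m_sensor2))
```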