2019
DOI: 10.1016/j.acalib.2019.02.013
Application of adaptive boosting (AdaBoost) in demand-driven acquisition (DDA) prediction: A machine-learning approach

Cited by 48 publications (25 citation statements)
References 33 publications
“…Zarandi et al. [4] used AdaBoost with a Support Vector Regressor to model the minimum miscibility pressure of pure/impure CO2–crude oil systems and concluded that the model gives the most accurate results, with a very satisfactory error distribution. AdaBoost was also applied by Walker & Jiang [23] to the demand-driven acquisition of library materials and compared with a logistic regression model; the authors concluded that AdaBoost performs better, with an accuracy of 82%.…”
Section: Related Work
confidence: 99%
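The comparison described above can be sketched with scikit-learn. This is a minimal illustration, not the authors' code: the synthetic dataset stands in for the DDA usage features, which the excerpt does not specify.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Illustrative binary task standing in for "title triggered a purchase" vs not.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Fit AdaBoost and a logistic regression baseline on the same split.
ada = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

ada_acc = accuracy_score(y_te, ada.predict(X_te))
logit_acc = accuracy_score(y_te, logit.predict(X_te))
print(f"AdaBoost: {ada_acc:.2f}  Logistic regression: {logit_acc:.2f}")
```

Which model wins depends on the data; the 82% figure reported for AdaBoost applies to the original study's dataset, not to this toy example.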
“…To classify the risk level in the data, the following steps are presented in the pseudocode. The algorithm searches for the candidate support vectors, represented by S, and assumes that the support vectors occupy a space in which the parameters of the linear features of the hyperplane are stored [23].…”
Section: Support Vector Machine
confidence: 99%
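The support-vector set S mentioned above can be inspected directly with scikit-learn's `SVC`; the fitted model exposes the candidate support vectors and the hyperplane parameters (w, b). The toy data below are an assumption for illustration.

```python
import numpy as np
from sklearn.svm import SVC

# Two linearly separable classes: class 0 on the line y = x,
# class 1 on the line y = x + 2.
X = np.array([[0, 0], [1, 1], [2, 2], [3, 3], [0, 2], [1, 3]])
y = np.array([0, 0, 0, 0, 1, 1])

clf = SVC(kernel="linear").fit(X, y)

print(clf.support_vectors_)          # the candidate support vectors S
print(clf.coef_, clf.intercept_)     # hyperplane parameters w and b
```

Only the support vectors determine the separating hyperplane; the remaining points could be removed without changing the decision boundary.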
“…It is one of the most significant developments in machine learning [50,51]. AdaBoost [52] was the first widely used implementation of boosting and is still favoured for its accuracy, ease of deployment, and fast training time [53,54,55]. It uses shallow decision trees as the weak classifiers.…”
Section: Multi-class AdaBoost
confidence: 99%
“…It is one of the most significant developments in machine learning [46,47]. AdaBoost [48] was the first widely used implementation of boosting and is still favoured for its accuracy, ease of deployment, and fast training time [49,50,51]. It uses shallow decision trees as the weak classifiers.…”
Section: Multi-class AdaBoost
confidence: 99%