2022
DOI: 10.1016/j.eswa.2022.117943
Research on prediction of multi-class theft crimes by an optimized decomposition and fusion method based on XGBoost

Cited by 35 publications (9 citation statements)
References 9 publications
“…Random forest is an ensemble machine learning method based on Classification and Regression Trees (CART): each tree is constructed independently from a randomly chosen subset of the input variables, with the same distribution for all trees in the forest (Breiman, 2001; Jing et al., 2017). XGBoost is a boosting ensemble algorithm over decision trees: each new tree is fitted to the residuals between the current ensemble's predictions and the true values, and the final prediction accumulates the outputs of all trees (Yan et al., 2022).…”
Section: Methods (mentioning)
confidence: 99%
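The residual-learning loop described in that statement can be sketched in a few lines. This is a minimal illustration with single-split regression stumps and squared error, not the cited authors' implementation:

```python
# Minimal sketch of residual boosting (the core idea behind gradient
# boosting and XGBoost): each new "tree" -- here a one-threshold stump --
# is fitted to the residuals of the current ensemble, and its (shrunken)
# predictions are accumulated into the final prediction.
def fit_stump(x, r):
    """Best single-threshold split minimizing squared error on residuals r."""
    best = None
    for t in sorted(set(x)):
        left = [ri for xi, ri in zip(x, r) if xi <= t]
        right = [ri for xi, ri in zip(x, r) if xi > t]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((ri - ml) ** 2 for ri in left)
               + sum((ri - mr) ** 2 for ri in right))
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    _, t, ml, mr = best
    return lambda xi: ml if xi <= t else mr

def boost(x, y, n_rounds=10, lr=0.5):
    pred = [0.0] * len(x)
    for _ in range(n_rounds):
        residuals = [yi - pi for yi, pi in zip(y, pred)]  # current errors
        stump = fit_stump(x, residuals)                   # next tree fits residuals
        pred = [pi + lr * stump(xi) for pi, xi in zip(pred, x)]
    return pred

x = [1, 2, 3, 4, 5, 6]
y = [1.0, 1.2, 0.9, 3.1, 2.9, 3.0]
final = boost(x, y)
```

With each round the residuals shrink, so the accumulated predictions converge toward the targets; real XGBoost replaces the stumps with regularized trees fitted to gradients of an arbitrary loss.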
“…While XGBoost can be used for both classification and regression problems, the formulas and examples in this work refer to its use for classification. XGBoost enhances the basic GBM framework through system optimizations and algorithmic improvements [68]-[70]: (1) parallelized tree building, where the sequential tree-building process is implemented with parallelism [71]; (2) tree pruning, where XGBoost grows each tree to the maximum depth and then prunes backward until the improvement in the loss function falls below a threshold [72], [73]; (3) cache awareness and out-of-core computing, designed to reduce computation time and allocate memory efficiently [74], [75]; (4) regularization, a technique that avoids overfitting in linear and tree-based models by limiting or shrinking the estimated coefficients toward zero [69]; (5) native handling of missing values; and (6) built-in cross-validation at every boosting iteration, which removes the need to program this search explicitly or to specify the exact number of boosting rounds in a single run [76]-[78].…”
Section: Extreme Gradient Boosting (XGBoost) (mentioning)
confidence: 99%
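Points (2) and (4) above act through XGBoost's regularized split gain. A minimal sketch of that formula (not the library's code): `G` and `H` are the sums of first- and second-order loss derivatives over the instances in each child, `lam` is the L2 penalty on leaf weights, and `gamma` is the pruning threshold — a split survives only if its gain exceeds `gamma`.

```python
# Sketch of XGBoost's regularized leaf weight and split gain.
def leaf_weight(G, H, lam):
    # Optimal leaf weight w* = -G / (H + lambda); lam shrinks it toward zero.
    return -G / (H + lam)

def split_gain(GL, HL, GR, HR, lam, gamma):
    def score(G, H):
        return G * G / (H + lam)
    gain = 0.5 * (score(GL, HL) + score(GR, HR) - score(GL + GR, HL + HR))
    return gain - gamma  # negative result => this split is pruned

# Illustrative numbers (hypothetical gradient sums, not from the paper):
# with no regularization the split looks worthwhile...
g0 = split_gain(GL=-4.0, HL=4.0, GR=3.0, HR=3.0, lam=0.0, gamma=0.0)
# ...but with lam=1 and gamma=3 the same split is pruned.
g1 = split_gain(GL=-4.0, HL=4.0, GR=3.0, HR=3.0, lam=1.0, gamma=3.0)
```

This is how "tree pruning" and "regularization" interact: `gamma` prunes splits whose regularized gain is too small, while `lam` lowers the gain of every split and shrinks the leaf weights themselves.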
“…To address the high cost of, and the need for, energy-efficiency management between electric vehicles and charging systems, Arumugam et al. 17 designed a hybrid strategy based on border collie optimization (BCO) and a gradient-boosted decision tree (GBDT) for managing energy consumption between distribution systems and electric vehicles. To address frequent theft cases, Yan et al. 18 proposed an optimized decomposition and fusion method combined with XGBoost to predict theft crimes, which can effectively reduce the occurrence of thefts. Liang et al. 19 proposed an atmospheric-parameter evaluation method based on LightGBM, providing a new approach for atmospheric research.…”
Section: Introduction (mentioning)
confidence: 99%