2019
DOI: 10.1109/access.2019.2952655
Feature Recognition of Urban Road Traffic Accidents Based on GA-XGBoost in the Context of Big Data

Abstract: Identifying the characteristics of urban road traffic accidents is of great significance for reducing traffic accidents and the corresponding losses. In the context of big data, to accurately understand the characteristics of traffic accidents, a feature set of urban road traffic accidents is proposed, the XGBoost model is used to classify traffic accidents into minor accidents, general accidents, major accidents and serious accidents, and a GA-XGBoost feature recognition model is built. The GA-XGB…

Cited by 65 publications (24 citation statements)
References 28 publications (25 reference statements)
“…Furthermore, a previous study has reported positive and promising results from integrating GA as a feature selection method with XGBoost as a classification model. Qu et al. (2019) proposed GA-XGBoost to identify the characteristics/features related to traffic accidents [32]. They applied the model to big data on traffic accidents from 7 cities in China.…”
Section: Extreme Gradient Boosting (XGBoost) and Genetic Algorithms (GA), mentioning
confidence: 99%
“…The genetic algorithm (GA) is a feature selection method that has been used to find the best feature subsets [24,25] and has shown significant advantages in improving performance for emotional stress state detection [26], prediction of severe chronic disorders of consciousness [27], recognition and classification of children's activities [28], gene encoders [29], hepatitis prediction [30], and coronavirus disease (COVID-19) patient detection [31]. A previous study revealed that combining GA and XGBoost improved model performance [32]. However, to the best of our knowledge, none of the previous studies have combined GA and XGBoost to improve model performance, especially for the case of self-care classification.…”
Section: Introduction, mentioning
confidence: 99%
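The GA-plus-XGBoost combination these statements refer to can be illustrated with a short, self-contained sketch. This is only an assumption-laden illustration: the synthetic dataset, population size, mutation rate and cross-validated fitness below are placeholders, not the configuration used by Qu et al. (2019) or by the citing papers.

# Illustrative GA feature-selection wrapper around XGBoost (sketch only;
# population size, mutation rate and the fitness function are assumptions,
# not the setup of Qu et al. (2019) or the citing papers).
import numpy as np
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=20, n_informative=8,
                           random_state=0)

def fitness(mask):
    # Fitness = mean cross-validated accuracy of XGBoost on the selected subset.
    if mask.sum() == 0:
        return 0.0
    model = xgb.XGBClassifier(n_estimators=100, max_depth=4)
    return cross_val_score(model, X[:, mask.astype(bool)], y, cv=3).mean()

def evolve(pop_size=20, n_gen=10, mut_rate=0.05):
    pop = rng.integers(0, 2, size=(pop_size, X.shape[1]))
    for _ in range(n_gen):
        scores = np.array([fitness(ind) for ind in pop])
        # Truncation selection: keep the top half, refill by crossover + mutation.
        parents = pop[np.argsort(scores)[-pop_size // 2:]]
        children = list(parents)
        while len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = int(rng.integers(1, X.shape[1]))
            child = np.concatenate([a[:cut], b[cut:]])
            flips = rng.random(X.shape[1]) < mut_rate
            child[flips] = 1 - child[flips]
            children.append(child)
        pop = np.array(children)
    best = pop[int(np.argmax([fitness(ind) for ind in pop]))]
    return best

best_mask = evolve()
print("selected feature indices:", np.flatnonzero(best_mask))

Each chromosome is a binary mask over the features; the GA searches for the subset on which XGBoost scores best, which is the general pattern the citing papers describe.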
“…To avoid sampling errors, the distribution of samples in the training and testing datasets is kept the same as in the original set. The parameters of the base XGBoost, KNN and SVM classifiers are tuned following the parameter adjustment guidelines in [12,21,31]. The optimal hyperparameters of AL-XGBoost are selected with the BO technique, and the number of early stopping rounds is set to 200 to reduce the optimization time.…”
Section: A. Experimental Setting, mentioning
confidence: 99%
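The early-stopping setup mentioned here can be sketched as follows. The 200-round patience comes from the quoted text; the dataset, learning rate and other parameters are illustrative assumptions rather than the AL-XGBoost configuration itself, and the Bayesian optimization step is omitted.

# Sketch of XGBoost training with a validation set and early stopping
# (placeholder data and hyperparameters; only the 200-round patience is
# taken from the citing paper).
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=30, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

params = {"objective": "binary:logistic", "eta": 0.1,
          "max_depth": 6, "eval_metric": "logloss"}
dtrain = xgb.DMatrix(X_tr, label=y_tr)
dval = xgb.DMatrix(X_val, label=y_val)

# Training stops once the validation loss has not improved for 200 rounds,
# which keeps hyperparameter search from wasting time on over-long runs.
booster = xgb.train(
    params,
    dtrain,
    num_boost_round=5000,
    evals=[(dval, "validation")],
    early_stopping_rounds=200,
    verbose_eval=False,
)
print("best iteration:", booster.best_iteration)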
“…During the training of an XGBoost classifier, the training set is generally made large enough to achieve high classification accuracy [21]. However, a very large training set may make the training process time-consuming, especially for a CPES with multiple nodes.…”
Section: Introduction, mentioning
confidence: 99%
“…LightGBM has a maximum depth parameter: its trees grow leaf-wise, but the depth limit prevents overfitting. Gradient boosting, due to its tree structure, is known to work well on tabular data, but researchers have recently also found it useful in various other applications [55][56][57][58][59][60][61][62][63][64][65][66][67].…”
Section: Why LightGBM?, mentioning
confidence: 99%
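A minimal LightGBM example of the depth cap mentioned in this statement; the dataset and parameter values are assumptions for illustration only, not taken from the cited work.

# LightGBM grows trees leaf-wise; capping max_depth (together with num_leaves)
# reins in overly deep trees that tend to overfit tabular data.
import lightgbm as lgb
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = lgb.LGBMClassifier(n_estimators=200, max_depth=6, num_leaves=31,
                         learning_rate=0.1)
clf.fit(X_tr, y_tr)
print("test accuracy:", accuracy_score(y_te, clf.predict(X_te)))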