2022
DOI: 10.1177/03611981221074370
Efficient Histogram-Based Gradient Boosting Approach for Accident Severity Prediction With Multisource Data

Abstract: Many people lose their lives in road accidents because they do not receive timely treatment from emergency medical services after the crash; providing timely emergency services can reduce both the fatality rate and the severity of accidents. In this study, we predicted the severity of car accidents for use by trauma centers and hospitals in emergency response management. The predictions of our model could be used to decide whether an ambulance unit should be dispatched to the crash site. This st…

Cited by 25 publications (6 citation statements); references 78 publications. Citation types: 0 supporting, 6 mentioning, 0 contrasting.

Citation statements (ordered by relevance)
“…The boosting family of algorithms is less susceptible to outliers, can handle a variety of categorical and numerical variables without considerable preprocessing, and is scalable with efficient and parallelized implementations for diverse classification problems [ 58 ]. The HISTGradientBoosting [ 59 ], LightGBM, Random Forest [ 60 ], DART [ 61 ], and XGBoost [ 62 ] algorithms were compared to find the best ML soft sensor that was most appropriate for this case study. A box plot, which is shown in Figure 4 , was used to reflect the probable outliers, central tendency, and dispersion for the HISTGradientBoosting, LightGBM, Random Forest, DART, and XGBoost algorithms in predicting changeovers in the presence of Gaussian noise.…”
Section: Materials and Methods (mentioning)
confidence: 99%
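A minimal sketch of the comparison the statement describes, assuming scikit-learn, LightGBM, XGBoost, and matplotlib are installed; the synthetic dataset, the DART-via-LightGBM choice, and all names are illustrative, not the cited study's code:

```python
# Minimal sketch: compare the boosting-family models named above with
# 10-fold cross-validation and summarize the score distributions in a
# box plot (central tendency, dispersion, probable outliers).
# Synthetic data stands in for the study's dataset.
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.ensemble import HistGradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from lightgbm import LGBMClassifier
from xgboost import XGBClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

models = {
    "HistGradientBoosting": HistGradientBoostingClassifier(random_state=0),
    "LightGBM": LGBMClassifier(random_state=0),
    "RandomForest": RandomForestClassifier(random_state=0),
    "DART": LGBMClassifier(boosting_type="dart", random_state=0),  # DART via LightGBM
    "XGBoost": XGBClassifier(random_state=0),
}

scores = {name: cross_val_score(m, X, y, cv=10) for name, m in models.items()}

plt.boxplot(list(scores.values()), labels=list(scores.keys()))
plt.ylabel("10-fold CV accuracy")
plt.xticks(rotation=30)
plt.tight_layout()
plt.show()
```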
“…To maximize data utilization, we used the latitude, longitude, month, ambient light situation, hour, and other weather-related variables of each error to impute missing values in the weather conditions. Using histogram-based gradient boosting decision trees (HistGBDT), chosen for their ability to handle missing values in features [29][30][31] and tuned by grid search, we achieved an accuracy of 93% and a weighted average F1-score of 92% in imputing the missing values. Because of the large dataset and the study's objective of assessing all possible variables, we restricted our analysis to data without any missing values, resulting in a final dataset comprising 619,988 rows.…”
Section: Fatigue (mentioning)
confidence: 99%
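A sketch of that imputation step, assuming scikit-learn and pandas; the column names, the parameter grid, and the impute_weather helper are illustrative assumptions (not the cited study's code), and the predictor columns are taken to be numerically encoded:

```python
# Sketch: fill missing values in a categorical target column ("weather")
# by training a HistGradientBoostingClassifier on the rows where it is
# known, tuning it with grid search, and predicting the rest.
# Column names and the parameter grid are illustrative, not the study's.
import pandas as pd
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

def impute_weather(df: pd.DataFrame) -> pd.DataFrame:
    features = ["latitude", "longitude", "month", "hour", "ambient_light"]
    known = df[df["weather"].notna()]
    missing = df[df["weather"].isna()]

    # HistGBDT handles NaNs in the *features* natively, so the predictors
    # need no imputation of their own before fitting.
    search = GridSearchCV(
        HistGradientBoostingClassifier(random_state=0),
        param_grid={"max_iter": [100, 300], "learning_rate": [0.05, 0.1]},
        cv=5,
    )
    search.fit(known[features], known["weather"])

    out = df.copy()
    if len(missing):
        out.loc[missing.index, "weather"] = search.predict(missing[features])
    return out
```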
“…29,618 (1.87% [95% CI: 1.85-1.89]) occurred in identified accident hotspots. Examining error types revealed that the majority of errors were associated with harsh turning (57.02% [56.95-57.10]) and fatigue (29.51% [29.44-29.58]).…”
(mentioning)
confidence: 99%
“…Histogram-Based Gradient Boosting uses histograms to speed up training [24]. It optimizes the same loss function as traditional gradient boosting but employs histogram-based techniques for better efficiency.…”
Section: Histogram-Based Gradient Boosting (mentioning)
confidence: 99%
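A small illustration of the binning idea, assuming only NumPy; the quantile-binning scheme below mimics what histogram-based implementations do (scikit-learn's, for instance, bins each feature into at most 255 buckets by default) without being any library's actual internals:

```python
# Illustration: histogram-based split finding scans a fixed number of
# bin boundaries instead of every sorted feature value. Quantile binning
# gives each bin roughly the same number of samples.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)  # one continuous feature
max_bins = 255                # typical cap (e.g., scikit-learn's default)

# Bin edges at evenly spaced quantiles; map each value to its bin index.
edges = np.quantile(x, np.linspace(0.0, 1.0, max_bins + 1))
binned = np.searchsorted(edges[1:-1], x, side="right")

print(np.unique(x).size)       # ~100,000 candidate split thresholds raw
print(np.unique(binned).size)  # at most 255 thresholds after binning
```

Split gain is then evaluated per bin boundary, which is what makes training fast on large tabular datasets while optimizing the same loss as classical gradient boosting.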