2019 Moratuwa Engineering Research Conference (MERCon)
DOI: 10.1109/mercon.2019.8818915

Taxi Trip Travel Time Prediction with Isolated XGBoost Regression

Cited by 29 publications
(18 citation statements)
References 16 publications
“…It was applied in solving machine learning challenges in different application domains. XGBoost is an algorithm that has an ensemble of DTs and is robust to outliers, and therefore the XGBoost algorithm is thought to perform well in time-series-related predictions [31]. Boosting is another ensemble tree-based method, first proposed by Kearns in 1988 [32]. Compared with the bagging method, which is a parallel process (i.e., each DT runs independently and their outputs are aggregated at the end), the boosting method behaves more like a gradual process that improves the prediction by developing multiple models in sequence, emphasizing those training cases that are difficult to estimate [30].…”
Section: Extreme Gradient Boosting
confidence: 99%
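The bagging-versus-boosting contrast described in the excerpt above can be sketched in plain Python using depth-1 regression trees (stumps). The toy dataset, learning rate, and round counts below are illustrative assumptions, not values from the cited papers, and the stump learner is a deliberate simplification of the decision trees XGBoost actually grows:

```python
import random

def fit_stump(xs, ys):
    """Fit a depth-1 regression tree: pick the split threshold on x that
    minimizes squared error, predicting the mean of y on each side."""
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = sum((y - ml) ** 2 for y in left) + sum((y - mr) ** 2 for y in right)
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    _, t, ml, mr = best
    return lambda x: ml if x <= t else mr

def boost(xs, ys, rounds=100, lr=0.3):
    """Boosting: fit stumps sequentially, each on the residuals left by the
    previous ones -- hard-to-fit points keep large residuals and therefore
    dominate later rounds (the 'emphasis' the excerpt describes)."""
    stumps, resid = [], list(ys)
    for _ in range(rounds):
        s = fit_stump(xs, resid)
        stumps.append(s)
        resid = [r - lr * s(x) for x, r in zip(xs, resid)]
    return lambda x: sum(lr * s(x) for s in stumps)

def bag(xs, ys, rounds=50, seed=0):
    """Bagging: fit stumps independently on bootstrap resamples of the
    training data and average their outputs at the end (parallel process)."""
    rng = random.Random(seed)
    n = len(xs)
    stumps = []
    for _ in range(rounds):
        idx = [rng.randrange(n) for _ in range(n)]
        stumps.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    return lambda x: sum(s(x) for s in stumps) / len(stumps)

# Toy 1-D target: a quadratic with a jump at x = 1.0.
xs = [i / 10 for i in range(20)]
ys = [x * x + (1.0 if x > 1.0 else 0.0) for x in xs]

booster = boost(xs, ys)
bagger = bag(xs, ys)
mse = lambda f: sum((f(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
```

On this toy data the sequential residual-fitting of `boost` drives training error far lower than averaging independently grown stumps, which illustrates why boosting is described as a gradual, corrective process rather than a parallel vote.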
“…The experimental part was carried out on a real-world dataset. Specifically, we trained the model on taxi trip recordings across New York City territories, a popular open data source already utilized in a variety of mobility-related studies (Jindal et al, 2017;Kankanamge et al, 2019;Wang & Ross, 2019;Xu et al, 2021).…”
Section: Data
confidence: 99%
“…Regardless of the data type, XGB is well known to provide better solutions than other ML algorithms because of its speed, efficiency, and scalability [52,53]. It has been the focus of research in various fields [54][55][56]. In particular, in mechanical machining [52,57,58], XGB is a good choice for predicting tool wear and surface roughness.…”
Section: Extreme Gradient Boosting Regression (XGB)
confidence: 99%