2018
DOI: 10.1109/lgrs.2018.2803259

Very High Resolution Object-Based Land Use–Land Cover Urban Classification Using Extreme Gradient Boosting

Abstract: In this letter, the recently developed Extreme Gradient Boosting (XGBoost) classifier is implemented in a very-high-resolution (VHR) object-based urban Land Use–Land Cover application. In detail, we investigated the sensitivity of XGBoost to various sample sizes, as well as to feature selection (FS) by applying a standard technique, Correlation-Based Feature Selection. We compared XGBoost with benchmark classifiers such as Random Forest (RF) and Support Vector Machines (SVM). The methods are applied to VHR image…
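As a minimal sketch of the kind of comparison the abstract describes (not the authors' code), the snippet below fits XGBoost, RF, and SVM classifiers on tabular object features and applies a simple correlation-based filter as an illustrative stand-in for Correlation-Based Feature Selection. It assumes the xgboost and scikit-learn Python packages; the synthetic data is a placeholder for the per-object spectral, textural, and geometric features used in the study.

```python
# Sketch: XGBoost vs. RF vs. SVM on object-like tabular features,
# with a crude correlation filter standing in for CFS.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

# Synthetic stand-in for per-object features and land-cover labels.
X, y = make_classification(n_samples=2000, n_features=40, n_informative=15,
                           n_classes=5, n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# Keep the 20 features most correlated with the label (illustrative proxy for
# CFS, which additionally penalizes correlation among the selected features).
corr = np.array([abs(np.corrcoef(X_train[:, j], y_train)[0, 1])
                 for j in range(X_train.shape[1])])
keep = np.argsort(corr)[-20:]

models = {
    "XGBoost": XGBClassifier(n_estimators=300, max_depth=6, learning_rate=0.1,
                             subsample=0.8, eval_metric="mlogloss"),
    "RandomForest": RandomForestClassifier(n_estimators=300, random_state=0),
    "SVM": SVC(kernel="rbf", C=10, gamma="scale"),
}
for name, model in models.items():
    model.fit(X_train[:, keep], y_train)
    acc = accuracy_score(y_test, model.predict(X_test[:, keep]))
    print(f"{name}: overall accuracy = {acc:.3f}")
```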

Cited by 216 publications (115 citation statements)
References 25 publications
“…We first implemented random forest and gradient boosting via XGBoost [71], two ensemble machine learning methods that have proven effective with the semantic segmentation of images across a number of disciplines including bioimage analysis [72][73][74][75][76][77][78][79] and remote sensing [80][81][82][83][84][85]. Unlike neural networks and deep learning methods, models trained with these algorithms benefit from simplicity and a relatively high level of interpretability [86][87][88].…”
Section: Random Forest and Gradient Boosting (mentioning)
confidence: 99%
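A minimal sketch of the interpretability point made in the excerpt above (not code from the cited study): tree ensembles such as Random Forest expose per-feature importances directly. The data and feature indices below are synthetic placeholders.

```python
# Sketch: inspecting impurity-based feature importances of a tree ensemble.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=1000, n_features=8, n_informative=4, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Rank features by their mean importance across trees.
for j in sorted(range(X.shape[1]), key=lambda i: rf.feature_importances_[i], reverse=True):
    print(f"feature_{j}: importance = {rf.feature_importances_[j]:.3f}")
```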
“…Recently, new ensemble learning algorithms, such as CCF (2015) and XgBoost (2016), have been explored and tested for the classification of remotely sensed images in a number of studies [26][27][28][29]. A relatively new ensemble learning algorithm, LightGBM (2017), has been introduced to the machine learning community, and has been received with great interest because it has outperformed the existing boosting frameworks in recent machine learning and data science competitions, especially in regards to complex datasets [43].…”
Section: Light Gradient Boosting Machine (mentioning)
confidence: 99%
“…Recently, new ensemble learning algorithms, such as canonical correlation forest (CCF) (2015), extreme gradient boosting (XgBoost) (2016), and Light Gradient Boosting Machine (LightGBM) (2017), have been introduced to the machine learning community [23][24][25]. A very limited number of papers have been published regarding CCF [26,27] and XgBoost [28][29][30] for classification purposes in remote sensing; however, no remote sensing study has yet used the recently launched LightGBM, a highly efficient gradient boosting decision tree developed by Microsoft Research, for classification purposes. Only one paper, by Liu, Ji, and Buchroithner [31], has tested LightGBM, in that case for soil property retrieval in combination with partial least squares.…”
Section: Introduction (mentioning)
confidence: 99%
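For reference, a minimal sketch of fitting a LightGBM classifier on tabular remote sensing features of the kind discussed in the excerpt above, assuming the lightgbm and scikit-learn Python packages; the synthetic data is a placeholder, not drawn from the cited studies.

```python
# Sketch: cross-validated LightGBM classification on tabular features.
from lightgbm import LGBMClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=3000, n_features=30, n_informative=12,
                           n_classes=6, n_clusters_per_class=1, random_state=1)

clf = LGBMClassifier(n_estimators=400, num_leaves=31, learning_rate=0.05,
                     subsample=0.8, colsample_bytree=0.8, random_state=1)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validated accuracy
print(f"mean CV accuracy: {scores.mean():.3f}")
```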
“…Stefanos Georganos et al. [2] implemented Extreme Gradient Boosting (XGBoost) for very-high-resolution (VHR) object-based estimation of urban land cover.…”
Section: Related Work (mentioning)
confidence: 99%