2022 31st Conference of Open Innovations Association (FRUCT)
DOI: 10.23919/fruct54823.2022.9770928

AGBoost: Attention-based Modification of Gradient Boosting Machine

Abstract: A new attention-based model for the gradient boosting machine (GBM), called AGBoost (attention-based gradient boosting), is proposed for solving regression problems. The main idea behind the proposed AGBoost model is to assign attention weights with trainable parameters to iterations of GBM under the condition that decision trees are used as base learners in GBM. Attention weights are determined by applying properties of decision trees and by using Huber's contamination model, which provides an interesting linear dep…
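The abstract is truncated, so the exact weighting scheme cannot be reproduced from it. The sketch below only illustrates the general idea it describes: shallow regression trees as base learners, attention weights over boosting iterations, and an ε-contamination mixture with uniform weights as a stand-in for the Huber-contamination construction. The function names, the per-iteration scoring rule, and the softmax temperature are illustrative assumptions, not the authors' formulation.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_agboost_sketch(X, y, n_iter=50, lr=0.1, eps=0.1, tau=1.0):
    # Standard squared-loss gradient boosting with shallow trees, plus a
    # per-iteration score that is later turned into attention weights.
    f = np.full(len(y), y.mean(), dtype=float)
    trees, scores = [], []
    for _ in range(n_iter):
        residual = y - f                        # negative gradient of squared loss
        tree = DecisionTreeRegressor(max_depth=3).fit(X, residual)
        f = f + lr * tree.predict(X)
        trees.append(tree)
        scores.append(-np.mean((y - f) ** 2))   # hypothetical score: training fit
    scores = np.asarray(scores)
    soft = np.exp((scores - scores.max()) / tau)
    soft /= soft.sum()
    # epsilon-contamination mixture: softmax scores contaminated by uniform weights
    attention = (1.0 - eps) * soft + eps / n_iter
    return y.mean(), trees, attention, lr

def predict_agboost_sketch(model, X):
    f0, trees, attention, lr = model
    # Re-weight each iteration's contribution by its attention weight;
    # uniform weights (1/T) would recover the plain GBM prediction.
    contrib = np.stack([lr * t.predict(X) for t in trees])
    return f0 + len(trees) * (attention[:, None] * contrib).sum(axis=0)

# toy usage
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
model = fit_agboost_sketch(X, y)
print(predict_agboost_sketch(model, X[:5]))
```

In this sketch ε and τ would be tuned (e.g. on a validation set) rather than trained by gradient descent; the paper treats the attention parameters as trainable, which this simplified version does not reproduce.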

Cited by 7 publications (12 citation statements)
References: 27 publications
“…The Nadaraya-Watson regression in machine learning. There are several machine learning approaches based on applying the Nadaraya-Watson regression [78][79][80][81][82][83][84]. Properties of the boosting with kernel regression estimates as weak learners were studied in [85].…”
Section: Related Work (citation type: mentioning)
confidence: 99%
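The Nadaraya-Watson estimator referenced in these citation statements is a standard kernel smoother: the prediction at a query point is a kernel-weighted average of the training targets, the same form that attention mechanisms generalise. For context, a minimal NumPy implementation with a Gaussian kernel (the bandwidth and names are chosen for illustration, not taken from the cited works):

```python
import numpy as np

def nadaraya_watson(x_query, x_train, y_train, bandwidth=0.5):
    # Squared distances between each query point and all training points.
    d2 = (x_query[:, None] - x_train[None, :]) ** 2
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))   # Gaussian kernel weights
    w /= w.sum(axis=1, keepdims=True)          # normalise so weights sum to 1
    return w @ y_train                         # weighted average of targets

# toy usage: smooth a noisy sine curve
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 2 * np.pi, 100))
y = np.sin(x) + 0.1 * rng.normal(size=100)
x_new = np.linspace(0, 2 * np.pi, 5)
print(nadaraya_watson(x_new, x, y))
```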
“…In order to extend a set of attention models, several random forest models in which attention mechanisms were incorporated were proposed in [23,34,35]. A gradient boosting machine to which an attention mechanism was added was presented in [36].…”
Section: Related Work (citation type: mentioning)
confidence: 99%
“…The Nadaraya-Watson regression in machine learning. There are several machine learning approaches based on applying the Nadaraya-Watson regression [74,75,76,77,78,79,80]. Properties of the boosting with kernel regression estimates as weak learners were studied in [81].…”
Section: Related Work (citation type: mentioning)
confidence: 99%