2011
DOI: 10.1007/s12648-011-0162-z

Total cross section prediction of the collisions of positrons and electrons with alkali atoms using Gradient Tree Boosting

Abstract: A relatively new computational technique, gradient tree boosting (GTB), is presented for modeling the total cross sections of the scattering of positrons and electrons by alkali atoms in the low and intermediate energy regions. The GTB model was trained on experimental total collisional cross-section data to produce the total cross section for each alkali atom as a function of the incident energy of …
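Although the abstract is truncated, the modeling setup it describes, boosted regression trees mapping incident energy to a total cross section, can be sketched with scikit-learn. The data below is synthetic and the hyperparameters are illustrative; this is not the authors' actual configuration.

```python
# Minimal sketch of the paper's setup, assuming a synthetic decaying
# energy -> cross-section curve and sklearn's GradientBoostingRegressor
# as a stand-in for the authors' GTB implementation.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Hypothetical training set: incident energy (eV) vs. total cross
# section (arbitrary units), a smooth decaying curve plus noise.
energy = rng.uniform(1.0, 100.0, size=200)
cross_section = 50.0 / (1.0 + 0.1 * energy) + rng.normal(0.0, 0.5, size=200)

model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05,
                                  max_depth=3, random_state=0)
model.fit(energy.reshape(-1, 1), cross_section)

# Predict cross sections at intermediate incident energies
pred = model.predict(np.array([[10.0], [50.0]]))
```

Once trained, the ensemble interpolates the cross-section curve between the measured energies, which is the role GTB plays in the paper.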

Cited by 18 publications (9 citation statements)
References 38 publications
“…Because Eq. (2) is difficult to compute, Friedman [29] proposed a two-step procedure. First, the parameters $a_m$ of the $m$th regression tree are determined as
$$a_m = \arg\min_{a,\beta} \sum_{i=1}^{N} \left[ -g_m(x_i) - \beta\, h(x_i; a) \right]^2,$$
where $g_m(x_i) = \left[ \partial L(y_i, F(x_i)) / \partial F(x_i) \right]_{F(x)=F_{m-1}(x)}$ is the gradient. Secondly, the optimal value of $\beta_m$ is determined as
$$\beta_m = \arg\min_{\beta} \sum_{i=1}^{N} L\bigl(y_i,\, F_{m-1}(x_i) + \beta\, h(x_i; a_m)\bigr).$$
When the $m$th regression tree built from $a_m$ has $L_m$ leaf nodes, the regression is given by
$$h(x; a_m) = \sum_{l=1}^{L_m} \bar{y}_{lm}\, 1(x \in R_{lm}),$$
where $1(\cdot)$ is a Boolean function that outputs 1 when its argument is true. At each iteration $m$, the regression tree partitions the $x$ space into $L_m$ disjoint regions $R_{lm}$ [30], and $\bar{y}_{lm}$ is defined as the mean of the training data belonging to the $lm$th region.…”
Section: Introduction
Confidence: 99%
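Under squared-error loss the two-step procedure quoted above collapses into a simple loop: the negative gradient -g_m(x_i) is just the residual y_i - F_{m-1}(x_i), so each round fits a tree to the residuals and takes a shrunken step. A minimal sketch, assuming scikit-learn's DecisionTreeRegressor as the base learner and a fixed shrinkage in place of the line search for beta_m (function names here are illustrative, not from the cited papers):

```python
# Sketch of Friedman's two-step gradient boosting for squared-error loss.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_gbt(X, y, n_rounds=50, learning_rate=0.1, max_leaf_nodes=8):
    """Return the constant initial model F_0 and the fitted trees."""
    f0 = y.mean()                      # constant initial model F_0
    pred = np.full_like(y, f0, dtype=float)
    trees = []
    for _ in range(n_rounds):
        residual = y - pred            # -g_m(x_i) under squared loss
        # Step 1: fit a tree to the negative gradient; its L_m leaves
        # define the disjoint regions R_lm, each predicting a leaf mean.
        tree = DecisionTreeRegressor(max_leaf_nodes=max_leaf_nodes)
        tree.fit(X, residual)
        # Step 2: for squared loss the optimal step is absorbed into the
        # leaf means, so a fixed shrinkage stands in for beta_m here.
        pred += learning_rate * tree.predict(X)
        trees.append(tree)
    return f0, trees

def predict_gbt(f0, trees, X, learning_rate=0.1):
    out = np.full(X.shape[0], f0)
    for tree in trees:
        out += learning_rate * tree.predict(X)
    return out

# Toy data: learn a sine curve from 100 points
X = np.linspace(0, 10, 100).reshape(-1, 1)
y = np.sin(X[:, 0])
f0, trees = fit_gbt(X, y)
fitted = predict_gbt(f0, trees, X)
```

Each tree only has to model what the current ensemble still gets wrong, which is why the additive stages steadily drive down the training error.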
“…In our previous works [24]-[27], our group applied artificial intelligence and evolutionary computation techniques, such as neural networks, adaptive fuzzy inference systems, genetic programming, genetic algorithms, hybrid models, and many others, to solve many complex (nonlinear) problems in high-energy physics, and showed better fits to the corresponding experimental data than the conventional techniques.…”
Section: Introduction
Confidence: 99%
“…The third limitation is that the studies using both signals apply only a limited number of classifiers or regression models. Various recent machine learning classifiers exist, such as sparse representation classifiers (SRC), multi-adaptive sparse representation classifiers, XGBoost classifiers [46], and many others. Moreover, regression models other than the linear and logistic ones exist, including polynomial, quantile, and lasso models.…”
Section: Discussion and Limitations
Confidence: 99%