2016 23rd International Conference on Pattern Recognition (ICPR)
DOI: 10.1109/icpr.2016.7899634
Loss factors for learning Boosting ensembles from imbalanced data

Cited by 5 publications (4 citation statements)
References 20 publications
“…E and the number of negative samples in each partition N_e varies and depends on the partitioning method and the data distribution. In the case of random under-sampling without replacement, E is preselected and N_e takes a fixed random value. After that, the loss factor is calculated using the method proposed in (Soleymani et al., 2016b)…”
Section: Progressive Boosting for Learning Ensembles from Imbalanced Data
Confidence: 99%
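The statement above describes partitioning the negative (majority) class before boosting. As a rough illustration only, the sketch below shows random under-sampling without replacement into E preselected partitions; partition_negatives is a hypothetical helper, the equal partition size N_e = len(X_neg) // E is an assumption, and the actual partitioning and loss-factor method of Soleymani et al. (2016b) may differ.

```python
import numpy as np

def partition_negatives(X_neg, E, seed=0):
    """Hypothetical sketch: split negative samples into E disjoint
    partitions by random under-sampling without replacement.
    Assumes a fixed, equal partition size N_e = len(X_neg) // E;
    the cited work may choose N_e differently."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X_neg))       # shuffle once, no replacement
    n_e = len(X_neg) // E                   # fixed partition size N_e
    return [X_neg[idx[e * n_e:(e + 1) * n_e]] for e in range(E)]

# Usage: 600 negatives split into E = 3 partitions of N_e = 200 each.
X_neg = np.arange(600).reshape(600, 1)
parts = partition_negatives(X_neg, E=3)
print([p.shape for p in parts])             # [(200, 1), (200, 1), (200, 1)]
```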
“…The drawback of these cost-sensitive techniques is that they rely on the suitable selection of cost factors, which is often estimated by searching a range of possible values. In contrast, cost-free techniques modify learning algorithms by enhancing loss factor calculation without considering cost factors (Joshi et al., 2001; Kim et al., 2015; Soleymani et al., 2016b).…”
Section: Introduction
Confidence: 99%
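To make the cost-sensitive versus cost-free contrast concrete, the sketch below shows a generic AdaBoost-style weight update (labels in {-1, +1}); boosting_weight_update and its cost_pos parameter are illustrative assumptions, not the update rule of any paper cited above. The cost-sensitive path scales minority-class errors by a user-chosen cost factor, while the cost-free path leaves the update untouched and would instead adapt the loss factor (alpha) itself.

```python
import numpy as np

def boosting_weight_update(w, y, pred, alpha, cost_pos=None):
    """Illustrative AdaBoost-style update, not from the cited papers.
    cost_pos given  -> cost-sensitive: errors on the positive
                       (minority) class are amplified by a cost factor
                       that must be tuned over a range of values.
    cost_pos absent -> cost-free: plain exponential update; imbalance
                       would be handled in the alpha computation."""
    margin = y * pred                       # +1 if correct, -1 if wrong
    w_new = w * np.exp(-alpha * margin)
    if cost_pos is not None:
        wrong_pos = (y == 1) & (margin < 0) # misclassified minority samples
        w_new[wrong_pos] *= cost_pos
    return w_new / w_new.sum()              # renormalize to a distribution

# Usage: the misclassified minority sample gets extra weight when cost_pos is set.
w = np.full(4, 0.25)
y = np.array([1, 1, -1, -1])
pred = np.array([-1.0, 1.0, -1.0, -1.0])
print(boosting_weight_update(w, y, pred, alpha=0.5, cost_pos=2.0))
```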