Stacked ensemble coupled with feature selection for biomedical entity extraction (2013)
DOI: 10.1016/j.knosys.2013.02.008

Cited by 39 publications (19 citation statements)
References 16 publications

“…3.3.3 (R Foundation for Statistical Computing, Vienna, Austria) to automatically generate ML model structures based on seven ML algorithms: XG Boost, deep learning (DL), distributed random forest (DRF), generalised linear model (GLM), gradient boosting machine (GBM), extremely randomized trees, and stacked ensemble (SE). SE is an ensemble method that makes final predictions by incorporating decisions made from different models trained from other algorithms 28 .…”
Section: Methods
Citation type: mentioning (confidence: 99%)
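
The stacking idea described in this excerpt, where base models trained with different algorithms feed their predictions to a second-stage learner, can be sketched in Python with scikit-learn's StackingClassifier. The base learners and toy data below are illustrative stand-ins, not the cited study's models or data.

    # Minimal stacked-ensemble sketch (assumes scikit-learn is installed).
    from sklearn.datasets import make_classification
    from sklearn.ensemble import (GradientBoostingClassifier, RandomForestClassifier,
                                  StackingClassifier)
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Toy binary classification data standing in for a real biomedical task.
    X, y = make_classification(n_samples=500, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # Heterogeneous base learners; their out-of-fold predictions become the
    # meta-learner's inputs, which is how the final decision "incorporates
    # decisions made from different models".
    base_learners = [
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("gbm", GradientBoostingClassifier(random_state=0)),
        ("glm", LogisticRegression(max_iter=1000)),
    ]
    stack = StackingClassifier(estimators=base_learners,
                               final_estimator=LogisticRegression(),
                               cv=5)
    stack.fit(X_tr, y_tr)
    print("held-out accuracy:", stack.score(X_te, y_te))
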
“…generate ML model structures based on seven ML algorithms: XG Boost, deep learning (DL), distributed random forest (DRF), generalised linear model (GLM), gradient boosting machine (GBM), extremely randomized trees, and stacked ensemble (SE). SE is an ensemble method that makes final predictions by incorporating decisions made from different models trained from other algorithms 28 . For the attributes, for LR model we used logit link function and iteratively reweighted least squares (IWLS) estimation which is the default algorithm in glm() function in stats v3.6.2 package.…”
Section: Patients
Citation type: mentioning (confidence: 99%)
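
As a rough Python analogue of the R glm() call mentioned in the excerpt, statsmodels fits a binomial GLM with the default logit link by iteratively reweighted least squares (IRLS). The toy data below are hypothetical and only illustrate the call pattern, not the study's patient dataset.

    # Logistic GLM sketch (assumes numpy and statsmodels are installed).
    import numpy as np
    import statsmodels.api as sm

    # Hypothetical predictors and a binary outcome.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=200) > 0).astype(int)

    # The Binomial family defaults to the logit link; statsmodels estimates
    # GLMs by IRLS, mirroring the IWLS estimation used by R's glm().
    design = sm.add_constant(X)
    result = sm.GLM(y, design, family=sm.families.Binomial()).fit()
    print(result.summary())
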
“…Other popular ensemble methods include AdaBoost, Random Forest and Bagging [62]. Stacked ensembles are the simplest yet one of the most effective ensemble methods, widely used in a variety of applications [63,64]. Stacked ensemble acts as our baseline ensemble, compared with our proposed weighted ensemble based on differential evolution, a meta-heuristic weight optimization method.…”
Section: Ensembles
Citation type: mentioning (confidence: 99%)
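
The weighted ensemble contrasted with plain stacking in this excerpt can be illustrated with SciPy's differential_evolution, used here to search for blending weights that maximise accuracy on held-out predictions. The prediction matrix, labels, and objective below are made up for illustration.

    # Ensemble weight optimisation sketch (assumes numpy and scipy are installed).
    import numpy as np
    from scipy.optimize import differential_evolution

    # Hypothetical held-out probabilities from three base models
    # (rows are samples, columns are models) and the true labels.
    probs = np.array([[0.9, 0.6, 0.7],
                      [0.2, 0.4, 0.1],
                      [0.8, 0.7, 0.9],
                      [0.3, 0.2, 0.5]])
    y_true = np.array([1, 0, 1, 0])

    def neg_accuracy(weights):
        # Normalise the weights, blend the model probabilities, and return
        # negative accuracy so that minimisation maximises accuracy.
        w = np.asarray(weights) / (np.sum(weights) + 1e-12)
        blended = probs @ w
        return -np.mean((blended > 0.5).astype(int) == y_true)

    # Differential evolution is a meta-heuristic search over the weight space.
    result = differential_evolution(neg_accuracy,
                                    bounds=[(0.0, 1.0)] * probs.shape[1],
                                    seed=0)
    best_weights = result.x / (result.x.sum() + 1e-12)
    print("ensemble weights:", best_weights)
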
“…We trained a CRF model on 100% of the source domain and stacked it with another CRF trained on 90% of the folded in target domain data. Stacking employs a meta-classifier and is a popular method for constructing high performance ensembles of classifiers (Ekbal and Saha, 2013). In this case we collected the output labels from the source domain-trained CRF on target sentences and added them as features for the target domain trained CRF.…”
Section: M4: Stack
Citation type: mentioning (confidence: 99%)
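
A minimal sketch of the stacking step this excerpt describes: the source-domain model's predicted labels are folded in as extra token features before the target-domain model is trained. It assumes the sklearn-crfsuite package as a stand-in CRF implementation; the feature dicts and labels are placeholders, not the paper's data.

    # CRF stacking sketch (assumes the sklearn-crfsuite package is installed).
    import sklearn_crfsuite

    def add_source_labels(target_features, source_predictions):
        # Append the source-domain CRF's predicted label for each token
        # as an additional feature for the target-domain CRF.
        return [
            [dict(tok, source_label=pred) for tok, pred in zip(sent, preds)]
            for sent, preds in zip(target_features, source_predictions)
        ]

    # Placeholder token features and labels for two target-domain sentences.
    X_target = [[{"word": "aspirin"}, {"word": "reduces"}], [{"word": "il-2"}]]
    y_target = [["B-Drug", "O"], ["B-Protein"]]

    # In the cited setup the source CRF is trained on 100% of the source
    # domain; here it is fitted on the placeholder data just to be runnable.
    source_crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
    source_crf.fit(X_target, y_target)

    # Run the source model over target sentences, stack its labels in as
    # features, then train the target-domain CRF on the augmented input.
    X_stacked = add_source_labels(X_target, source_crf.predict(X_target))
    target_crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
    target_crf.fit(X_stacked, y_target)
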