2023
DOI: 10.1080/19439962.2023.2204843
An aggressive driving state recognition model using EEG based on stacking ensemble learning


Cited by 5 publications (3 citation statements); References 39 publications

“…where w_i is the ith training instance weight and n is the number of training instances. (v) Stacking ensemble method: Stacking, like bagging and boosting, is a method of integrating the predictions of various machine learning models trained on the same dataset [33]. The stacking technique's architecture consists of two or more models, known as base models or level-0 models, and meta-models that combine the predictions of the base models, known as level-1 models [34].…”
Section: Building the Two-layer Ensemble Model (mentioning)
confidence: 99%
“…where w_i is the ith training instance weight and n is the number of training instances. (v) Stacking ensemble method: Stacking, like bagging and boosting, is a method of integrating the predictions of various machine learning models trained on the same dataset [33].…”
Section: Building the Two-layer Ensemble Model (mentioning)
confidence: 99%
“…(v) Stacking ensemble method: Stacking, like bagging and boosting, is a method of integrating the predictions of various machine learning models trained on the same dataset [33]. The stacking technique's architecture consists of two or more models, known as base models or level-0 models, and meta-models that combine the predictions of the base models, known as level-1 models [34].…”
Section: Building the Two-layer Ensemble Model (mentioning)
confidence: 99%
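
To make the level-0 / level-1 structure described in these excerpts concrete, below is a minimal sketch of a two-layer stacking ensemble in Python with scikit-learn. The base learners (random forest and SVM), the logistic-regression meta-model, and the synthetic stand-in for EEG feature vectors are illustrative assumptions only; the excerpts do not specify which models or features the paper actually used.

    # Minimal stacking sketch (illustrative assumptions; not the paper's setup).
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier, StackingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    # Synthetic stand-in for EEG-derived feature vectors with binary labels
    # (e.g. aggressive vs. non-aggressive driving state).
    X, y = make_classification(n_samples=500, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Level-0: heterogeneous base models, all trained on the same dataset.
    base_models = [
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("svm", SVC(probability=True, random_state=0)),
    ]

    # Level-1: a meta-model that combines the base models' predictions.
    stack = StackingClassifier(
        estimators=base_models,
        final_estimator=LogisticRegression(),
        cv=5,
    )
    stack.fit(X_train, y_train)
    print(f"Stacking accuracy: {stack.score(X_test, y_test):.3f}")

With cv=5, scikit-learn fits the meta-model on out-of-fold predictions from the base models, the usual safeguard against the level-1 model simply memorizing the level-0 models' training-set outputs.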