2022
DOI: 10.14569/ijacsa.2022.0130351

Bayesian Hyperparameter Optimization and Ensemble Learning for Machine Learning Models on Software Effort Estimation

Cited by 13 publications (7 citation statements)
References 0 publications
“…Marco et al. [53] applied the AdaBoost ensemble learning method with RFR to different data sets. Several machine learning algorithms (MLP, SVR, CART, kNN, and RFR) were applied to the different data sets.…”
Section: Discussion (mentioning)
confidence: 99%
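The combination the statement describes, AdaBoost wrapped around a Random Forest regressor, can be sketched as below. This is a minimal illustration, not the paper's setup: the synthetic project features, the stand-in effort values, and all hyperparameters are assumptions.

```python
# Hedged sketch: AdaBoost ensemble with a Random Forest regressor as the
# base learner, in the spirit of the cited approach. Data and
# hyperparameters are illustrative placeholders.
import numpy as np
from sklearn.ensemble import AdaBoostRegressor, RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))  # stand-in project features
y = X @ np.array([3.0, 1.5, 0.5, 2.0]) + rng.normal(scale=0.1, size=200)  # stand-in effort

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = AdaBoostRegressor(
    RandomForestRegressor(n_estimators=20, random_state=0),  # base learner
    n_estimators=10,
    random_state=0,
)
model.fit(X_tr, y_tr)
print(round(model.score(X_te, y_te), 3))  # R^2 on held-out data
```

In practice the base learner and boosting rounds would be chosen per data set, e.g. via the Bayesian hyperparameter optimization the paper's title refers to.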
“…Variational autoencoders (VAEs) are generative models that learn to simulate the latent representation of data [37]. They are used to improve the quality of the generated outputs [38].…”
Section: Phase 3: Build the Variational Autoencoder (VAE) Model (mentioning)
confidence: 99%
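The latent representation the statement mentions can be illustrated with a VAE's forward pass. The sketch below uses random placeholder weights and no training loop; the layer sizes and latent dimension are assumptions chosen only to show the encode / reparameterize / decode structure.

```python
# Minimal NumPy sketch of a VAE forward pass with the reparameterization
# trick. Weights are random placeholders; sizes are illustrative.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 16))  # batch of 8 inputs, 16 features
d_latent = 4

# Encoder: linear maps to mean and log-variance of q(z|x)
W_mu = rng.normal(size=(16, d_latent))
W_logvar = rng.normal(size=(16, d_latent))
mu, logvar = x @ W_mu, x @ W_logvar

# Reparameterization: z = mu + sigma * eps keeps sampling differentiable
eps = rng.normal(size=mu.shape)
z = mu + np.exp(0.5 * logvar) * eps

# Decoder: linear map from the latent code back to input space
W_dec = rng.normal(size=(d_latent, 16))
x_hat = z @ W_dec

# KL divergence of q(z|x) from the standard normal prior (batch mean)
kl = -0.5 * np.mean(np.sum(1 + logvar - mu**2 - np.exp(logvar), axis=1))
print(x_hat.shape)
```

A real VAE would train the encoder and decoder to minimize reconstruction error plus this KL term.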
“…Ensembling is obtained by combining various models; bagging, boosting, and voting are some of the ensemble approaches [35]. Here we aggregated the predictions of various models, i.e., averaged the output predictions of all models, producing one model closer to the actual effort than any individual model.…”
Section: E. Voting (mentioning)
confidence: 99%
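The averaging aggregation described in that statement reduces to a mean over the per-model predictions. The sketch below uses made-up predictions for three hypothetical models on five projects; the model labels are illustrative only.

```python
# Averaging ("voting") ensemble: the ensemble prediction is the mean of
# the individual models' predictions. All numbers here are placeholders.
import numpy as np

# Predictions from three hypothetical models on five projects
preds = np.array([
    [10.0, 22.0, 35.0, 41.0, 18.0],  # e.g. MLP
    [12.0, 20.0, 33.0, 45.0, 16.0],  # e.g. SVR
    [11.0, 21.0, 34.0, 43.0, 17.0],  # e.g. RFR
])

ensemble = preds.mean(axis=0)  # average the output predictions per project
print(ensemble)  # [11. 21. 34. 43. 17.]
```

Weighted averaging is a common variant, where models with lower validation error receive larger weights.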