2020
DOI: 10.1016/j.chb.2018.06.032
Predicting at-risk university students in a virtual learning environment via a machine learning algorithm

Cited by 136 publications (79 citation statements)
References 31 publications
“…Additionally, one study predicted at-risk students by deploying various data mining techniques, including Support Vector Machines (SVM), Naïve Bayes Classifier, Decision Tree, K-Nearest Neighbor, and Multi-Layer Perceptron, to identify the best prediction modeling method; Logistic Regression was employed as the baseline model (Chui et al., 2018; Marbouti et al., 2016). Assessment engagement pattern is considered another parameter that effectively captures student behavior and has a positive impact on their performance (Hussain et al., 2018; Jung & Lee, 2018).…”
Section: Literature Review
confidence: 99%
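The comparison described above can be sketched as follows. This is not the cited papers' code: the dataset is synthetic and the model settings are illustrative assumptions, showing only the general pattern of benchmarking the named classifier families against a Logistic Regression baseline.

```python
# Illustrative sketch: compare the classifier families named in the quote
# against a Logistic Regression baseline on synthetic data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a student-performance dataset.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

models = {
    "Logistic Regression (baseline)": LogisticRegression(max_iter=1000),
    "SVM": SVC(),
    "Naive Bayes": GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "K-Nearest Neighbor": KNeighborsClassifier(),
    "Multi-Layer Perceptron": MLPClassifier(max_iter=2000, random_state=0),
}

# Mean cross-validated accuracy per model.
scores = {name: cross_val_score(m, X, y, cv=5).mean() for name, m in models.items()}
for name, acc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {acc:.3f}")
```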
“…We performed 5-fold cross-validation: each fold was iteratively selected to test the model, and the remaining four folds were used to train it [59, 60]. For group normalization, we fixed the number of groups to 8.…”
Section: Results
confidence: 99%
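The 5-fold procedure quoted above can be sketched as below; the data is a placeholder, not the study's dataset. Each fold serves once as the test set while the other four folds form the training set.

```python
# Sketch of 5-fold cross-validation: every fold is held out exactly once.
import numpy as np
from sklearn.model_selection import KFold

X = np.arange(20).reshape(10, 2)  # 10 toy samples
y = np.array([0, 1] * 5)

kf = KFold(n_splits=5, shuffle=True, random_state=0)
for i, (train_idx, test_idx) in enumerate(kf.split(X)):
    # Four folds (8 samples) train the model; one fold (2 samples) tests it.
    assert len(train_idx) == 8 and len(test_idx) == 2
    print(f"fold {i}: test indices {test_idx}")
```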
“…This improves the feature selection efficiency. Afterward, the features are fed to the designed optimized ANN for training.…”
Section: Simulation Results
confidence: 99%
“…Afterward, the features are fed to the designed optimized ANN for training. 57 In this study, 85% of the data were used for training and the remaining 15% for testing. Simulation results were compared with different state-of-the-art classification methods, such as an SVM-based classifier, 58 a genetic-based method, 31 the particle swarm optimization algorithm, 59 particle swarm optimization-based ANN (NNPSO), 59 optimized ANN based on a hybrid of particle swarm optimization and biogeography-based optimization (MLPPSOBBO), 60 and ANN based on deep learning (CNN).…”
Section: Simulation Results
confidence: 99%
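The 85%/15% split described in that statement can be sketched as follows; the data here is random placeholder data, not the study's dataset.

```python
# Sketch of an 85% train / 15% test split on illustrative data.
import numpy as np
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((200, 5))
y = rng.integers(0, 2, size=200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.15, random_state=0)
print(len(X_train), len(X_test))  # 170 and 30
```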