2014
DOI: 10.5120/17440-8211

Prediction of Student's Performance based on Incremental Learning

Abstract: Student datasets must be analyzed in order to assess students' performance and to improve study methods and the overall curriculum. Incremental learning methods are becoming popular because the amount of data and information is rising day by day, and classifiers need to be updated in order to scale learning to larger volumes of training data. Incremental learning is a technique in which data is processed in chunks and the results are merged, so that less memory is required. For this reason, in th…

Cited by 10 publications (6 citation statements)
References 33 publications
“…Online incremental learning is an artificial intelligence (AI) technique in which an analytical model is permanently adapted online to a data stream that is continuously received over time [4,5,13]. It has been used to fulfill various learning analytics objectives, including predicting students' performance [1,6,7], image classification [10,14] and text classification [11]. Most existing works focus either on solving the problems of catastrophic forgetting and concept drift, or on comparing online incremental algorithms.…”
Section: Related Work
confidence: 99%
“…Following (Salzberg, 1991; Wettschereck & Dietterich, 1995), modifications to the NGE theory were presented in (Martin, 1995), modifications to the classification algorithms were proposed in (Zaharie et al., 2011), and implementations of these classification algorithms for applications such as clustering and predicting students' performance were explored in (Hamidzadeh et al., 2015) and (Kulkarni & Ade, 2014), respectively. Nevertheless, no other modifications to the attribute weight learning algorithms of (Salzberg, 1991; Wettschereck & Dietterich, 1995) have been proposed.…”
Section: Attribute Weight Learning
confidence: 99%
“…The simplest way to deal with such lost data is to ignore the instances that have missing attributes. When large portions of the data suffer from missing features, filtering or listwise deletion is a suboptimal approach (and often impractical, because of the conditions required for it to work well); both variants are generally known as filtering [85]. A more realistic approach is imputation, in which missing values are filled with a meaningful estimate [86][87][88][89]. The approach of [84] produces an ensemble of classifiers, each one trained on a randomly chosen feature subset.…”
Section: Issues of Data Stream Learning - Missing Features
confidence: 99%
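The contrast this statement draws between filtering (listwise deletion) and imputation can be sketched with plain NumPy. The data, missingness rate, and column-mean fill are illustrative assumptions, not taken from the cited works.

```python
# Sketch contrasting listwise deletion with mean imputation for missing
# features, using only NumPy; values and missingness rate are illustrative.
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(loc=5.0, size=(100, 3))
mask = rng.random(X.shape) < 0.3          # ~30% of entries go missing
X_missing = np.where(mask, np.nan, X)

# Listwise deletion ("filtering"): drop every row with any missing attribute.
complete_rows = ~np.isnan(X_missing).any(axis=1)
X_deleted = X_missing[complete_rows]

# Mean imputation: fill each missing value with its column mean.
col_means = np.nanmean(X_missing, axis=0)
X_imputed = np.where(np.isnan(X_missing), col_means, X_missing)

# Deletion discards rows (here most of them, since any NaN disqualifies a
# row); imputation keeps the full dataset with no remaining NaNs.
print(len(X_deleted), len(X_imputed))
```

With 30% of entries missing independently, a 3-feature row is complete only about a third of the time, which illustrates why deletion becomes suboptimal as missingness grows.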