2020 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC)
DOI: 10.1109/cyberc49757.2020.00054

Convolutional Recurrent Neural Networks for Knowledge Tracing

Cited by 6 publications (7 citation statements). References 10 publications.
“…DKVMN [6] and Deep-IRT [7] on the seven described datasets. Note that the Python code used for the DKT model experiments requires that the train/test split be performed during code execution, so the data files have been converted to the format required by the experimentation process.…”
Section: Methods (mentioning), confidence: 99%
“…Recently, methods from the first two categories were combined with deep learning methods, for example DBN [8] and DPFA [9], achieving improved results. Additionally, convolutional neural network [10], [11], [12], [13] and graph neural network models [14], [15] have recently been proposed for the KT task. The aim of all these works is the optimal representation of the knowledge state and the prediction of student performance, so that the learning process can be improved and adapted to the student's learning needs.…”
Mentioning, confidence: 99%
“…In DKT, student responses serve as inputs, latent knowledge states are the hidden layers, and the predicted probabilities of answering questions correctly are the outputs [41]. There are many variations of DKT, as reported in [42]: some change the network structure [43], [44], while others incorporate different facets of student information [45]-[50]. In general, DKT substantially saves the effort of human experts, as it does not need to explicitly construct a prior domain knowledge model.…”
Section: B. Data-Driven Approaches (mentioning), confidence: 99%
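The excerpt above summarizes the standard DKT setup: past responses go in, a latent knowledge state lives in the recurrent hidden layer, and per-question correctness probabilities come out. As a rough illustration only, here is a minimal PyTorch sketch of such a model; the skill count, the one-hot (skill, correctness) input encoding, and the LSTM hidden size are illustrative assumptions, and this is not the convolutional recurrent architecture of the cited paper.

```python
import torch
import torch.nn as nn

class DKT(nn.Module):
    """Minimal DKT-style sketch: (skill, correctness) interactions as inputs,
    a latent knowledge state in the LSTM hidden layer, and per-skill
    probabilities of a correct answer at the next step as outputs."""

    def __init__(self, num_skills: int, hidden_size: int = 64):
        super().__init__()
        # Each interaction is one-hot encoded with index
        # skill_id + num_skills * correctness (size 2 * num_skills).
        self.num_skills = num_skills
        self.rnn = nn.LSTM(input_size=2 * num_skills,
                           hidden_size=hidden_size,
                           batch_first=True)
        self.out = nn.Linear(hidden_size, num_skills)

    def forward(self, interactions: torch.Tensor) -> torch.Tensor:
        # interactions: (batch, seq_len, 2 * num_skills) one-hot inputs
        hidden_states, _ = self.rnn(interactions)      # latent knowledge states
        return torch.sigmoid(self.out(hidden_states))  # (batch, seq_len, num_skills)


# Toy usage: 4 hypothetical skills, 2 students, 5 recorded interactions each.
num_skills, batch, seq_len = 4, 2, 5
skills = torch.randint(0, num_skills, (batch, seq_len))
correct = torch.randint(0, 2, (batch, seq_len))
x = nn.functional.one_hot(skills + num_skills * correct,
                          num_classes=2 * num_skills).float()
model = DKT(num_skills)
pred = model(x)    # predicted correctness probability for every skill at every step
print(pred.shape)  # torch.Size([2, 5, 4])
```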
“…As deep learning develops, many deep learning models have been applied to KT. Chris Piech was the first to apply a recurrent neural network (RNN) to model the student learning process, proposing deep knowledge tracing (DKT) [6][7][8][9].…”
Section: Introduction (mentioning), confidence: 99%