2022
DOI: 10.1016/j.knosys.2022.108274
Bi-CLKT: Bi-Graph Contrastive Learning based Knowledge Tracing

Cited by 167 publications (26 citation statements)
References 18 publications
“…In this paper, we propose a fused multi-feature knowledge tracing model, SGRUR@MECF, which first uses the LightGBM algorithm to evaluate the importance of the features in the dataset, selects the features with high importance as input features, and performs one-hot encoding and cross-coding of the correctness and additional features of student responses. Since cross-coding multiple features leads to high-dimensional input data, an autoencoder is used to compress the crossed features, and the compressed features are then input to the prediction model [8,9].…”
Section: Methods
confidence: 99%
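The encoding step this excerpt describes can be sketched in a few lines. The helper names and the skill/correctness layout below are illustrative assumptions, not taken from the cited SGRUR@MECF paper; the point is only how crossing a one-hot feature with response correctness doubles the dimensionality (which is what motivates the autoencoder compression step).

```python
# Hedged sketch: one-hot encode a categorical (e.g. skill) feature, then
# cross it with a binary correctness flag. All names are illustrative.

def one_hot(index, size):
    """Return a one-hot list of length `size` with a 1 at `index`."""
    vec = [0] * size
    vec[index] = 1
    return vec

def cross_encode(skill_id, num_skills, correct):
    """Cross a skill one-hot with response correctness.

    Produces a vector of length 2 * num_skills: the first block fires
    when the response was correct, the second when it was incorrect.
    """
    block = one_hot(skill_id, num_skills)
    zeros = [0] * num_skills
    return block + zeros if correct else zeros + block

# Example: skill 1 of 3, answered correctly -> length-6 vector.
v = cross_encode(1, 3, correct=True)
```

With many such crossed features the input dimension grows multiplicatively, which is why the excerpt compresses them with an autoencoder before prediction.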
“…KSGKT [29] is an improvement over GIKT, as it alleviates graph sparsity. In view of the problems encountered by traditional GNN-based knowledge tracing models, Song et al. [30] proposed bi-graph contrastive learning-based knowledge tracing (Bi-CLKT), which consists of three parts: subgraph establishment, contrastive learning and performance prediction.…”
Section: Graph Neural Networks' Application in Education
confidence: 99%
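The contrastive-learning stage the excerpt names can be illustrated with an InfoNCE-style objective, which is the standard choice in graph contrastive learning: an anchor embedding is pulled toward its positive view and pushed from negatives. The temperature, cosine similarity, and toy vectors below are assumptions for illustration, not Bi-CLKT's actual formulation.

```python
import numpy as np

# Hedged sketch of an InfoNCE-style contrastive loss (illustrative only):
# loss = -log( exp(s_pos/t) / (exp(s_pos/t) + sum_j exp(s_neg_j/t)) )

def info_nce(anchor, positive, negatives, temperature=0.5):
    """InfoNCE loss for one anchor against one positive and a negative set."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    pos = np.exp(cos(anchor, positive) / temperature)
    neg = sum(np.exp(cos(anchor, n) / temperature) for n in negatives)
    return -np.log(pos / (pos + neg))

a = np.array([1.0, 0.0])
# Aligned positive, misaligned negative -> small loss.
loss_close = info_nce(a, np.array([0.9, 0.1]), [np.array([-1.0, 0.0])])
# Misaligned positive, aligned negative -> large loss.
loss_far = info_nce(a, np.array([-1.0, 0.0]), [np.array([0.9, 0.1])])
```

Minimizing such a loss over node- and graph-level views is what makes the learned exercise and concept embeddings discriminative before the performance-prediction stage.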
“…Recently, approaches that construct graph-based explanations using graph neural networks have been proposed [25,3,26,27,28]. However, the explanation is still given in terms of gradient footprints of the deep learning model rather than the characteristics of the rumour itself.…”
Section: Related Work
confidence: 99%
“…Graph convolutional network (GCN). Given the above requirements, a Graph Convolutional Network (GCN) [36,26,27,28] provides a starting point for our embedding model. A GCN directly reflects the structure of a graph and the feature vectors assigned to nodes.…”
Section: Node Embedding
confidence: 99%
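The GCN the last excerpt takes as its starting point follows the familiar Kipf–Welling propagation rule, H' = ReLU(D^{-1/2}(A + I)D^{-1/2} H W): each node averages its neighbours' (and its own) features with symmetric degree normalisation, then applies a shared linear map and nonlinearity. A minimal sketch, with a toy graph and identity weights as assumptions:

```python
import numpy as np

def gcn_layer(adj, feats, weight):
    """One GCN layer: ReLU(D^-1/2 (A + I) D^-1/2 @ H @ W)."""
    a_hat = adj + np.eye(adj.shape[0])            # add self-loops
    deg = a_hat.sum(axis=1)
    d_inv_sqrt = np.diag(deg ** -0.5)             # D^{-1/2}
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt      # normalised adjacency
    return np.maximum(a_norm @ feats @ weight, 0.0)  # ReLU

# Toy example: 3-node path graph, 2-dim features, identity weights.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
H = np.eye(3, 2)   # node features
W = np.eye(2)      # layer weights
out = gcn_layer(A, H, W)
```

This is why the excerpt says a GCN "directly reflects the structure of a graph": the output for each node is a normalised mix of the feature vectors along its edges.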