Graph neural networks (GNNs) have shown strong performance on link prediction tasks. However, they suffer from high inference latency because prediction depends on aggregating data from node neighborhoods, which poses a challenge for practical application. In contrast, although the Multi-Layer Perceptron (MLP) performs worse, it has a shorter inference time and is more flexible in practical applications. We therefore use a distillation model to combine the powerful inference capability of GNNs with the inference efficiency of MLPs. Distillation models usually rely on a predefined distance function to quantify the differences between the teacher and student networks, but such functions do not transfer well to diverse, complex scenarios. In addition, the limited node information available to the MLP severely restricts its learning ability. To cope with these problems, we first propose an Adversarial Generative Discriminator (AGD), which trains a discriminator and a generator against each other to adaptively detect and reduce the teacher-student differences. Second, we propose a Feature Aggregation Module (FAM) that helps the MLP obtain sufficient feature information before distillation starts. Experiments show that our approach achieves good results on link prediction tasks, outperforming the baseline model Linkless Prediction (LLP) while maintaining a good inference speed on eight datasets under two different settings∗.
∗The code is available at https://github.com/lwuen/LPVAKD.git