Graph data, such as citation networks, social networks, and transportation networks, are prevalent in the real world. Graph neural networks (GNNs) have attracted widespread attention for their strong expressiveness and excellent performance in various graph analysis applications. However, the efficacy of GNNs relies heavily on sufficient labeled data and complex network models; the former is challenging to obtain, and the latter demands expensive computational resources. To address the scarcity of labeled data and the high complexity of GNNs, Knowledge Distillation (KD) has been introduced to enhance existing GNNs by transferring the soft-label supervision of a large teacher model to a small student model while maintaining prediction performance. However, adapting KD to graph data and graph-structured knowledge remains a major challenge. This survey offers a comprehensive overview of Graph-based Knowledge Distillation methods, systematically categorizing and summarizing them while discussing their limitations and future directions. The paper first introduces the background of graphs and KD. It then provides a comprehensive summary of three families of Graph-based Knowledge Distillation methods: Graph-based Knowledge Distillation for deep neural networks (DKD), Graph-based Knowledge Distillation for GNNs (GKD), and Self-Knowledge Distillation-based Graph-based Knowledge Distillation (SKD). Each family is further divided into methods that distill knowledge from the output layer, the middle layers, or a constructed graph. The ideas behind the various graph-based knowledge distillation algorithms are then analyzed and compared, and the advantages and disadvantages of each algorithm are summarized with supporting experimental results. In addition, applications of graph-based knowledge distillation in computer vision, natural language processing, recommendation systems, and other fields are reviewed.
Finally, the development of graph-based knowledge distillation is summarized and its future prospects are discussed. We have also released related resources at https://github.com/liujing1023/Graph-based-Knowledge-Distillation.
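The soft-label supervision transfer described above can be illustrated with a minimal sketch. This is not the survey's own formulation but the standard distillation objective it builds on: a temperature-softened KL term that pulls the student toward the teacher's output distribution, combined with the usual cross-entropy on hard labels. All function names and the NumPy implementation are illustrative assumptions.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Standard distillation loss (sketch): alpha-weighted sum of
    KL(teacher || student) at temperature T and hard-label cross-entropy."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # Soft-label term; the T**2 factor keeps gradient magnitudes comparable.
    soft = np.sum(
        p_teacher * (np.log(p_teacher + 1e-12) - np.log(p_student + 1e-12)),
        axis=-1,
    )
    # Hard-label cross-entropy on the student's unscaled predictions.
    hard = -np.log(
        softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12
    )
    return float(np.mean(alpha * (T**2) * soft + (1 - alpha) * hard))
```

In the graph setting surveyed here, the logits would come from a teacher GNN and a smaller student (a lighter GNN or even an MLP), and the categorization in this paper turns on where the distilled signal is taken from: output-layer predictions as above, intermediate-layer representations, or a graph constructed from the data.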