Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining 2023
DOI: 10.1145/3580305.3599256

All in One: Multi-Task Prompting for Graph Neural Networks

Abstract: Recently, "pre-training and fine-tuning" has been adopted as a standard workflow for many graph tasks, since it can transfer general graph knowledge to relieve the lack of annotations in each application. However, tasks at the node, edge, and graph levels are highly diverse, so the pre-training pretext is often incompatible with these multiple downstream tasks. This gap can even cause "negative transfer" to a specific application, leading to poor results. Inspired by prompt learning in natural…
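To make the "prompting a frozen pre-trained graph model" idea concrete, below is a minimal, hypothetical PyTorch sketch (not the paper's actual method): a learnable prompt vector is added to the node features of a frozen GNN encoder, and only the prompt and a small task head are trained. All class and variable names (TinyGCNLayer, PromptedGNN, etc.) are illustrative assumptions.

    # A minimal, hypothetical sketch of prompting a frozen pre-trained GNN
    # (illustrative only; not the paper's specific multi-task method).
    import torch
    import torch.nn as nn

    class TinyGCNLayer(nn.Module):
        """One dense GCN-style layer: aggregate neighbors, then transform."""
        def __init__(self, in_dim, out_dim):
            super().__init__()
            self.lin = nn.Linear(in_dim, out_dim)

        def forward(self, x, adj):
            # adj: (N, N) normalized adjacency; x: (N, in_dim)
            return torch.relu(self.lin(adj @ x))

    class PromptedGNN(nn.Module):
        """Frozen pre-trained encoder + a learnable prompt on node features."""
        def __init__(self, encoder, feat_dim, num_classes):
            super().__init__()
            self.encoder = encoder
            for p in self.encoder.parameters():   # freeze pre-trained weights
                p.requires_grad = False
            self.prompt = nn.Parameter(torch.zeros(1, feat_dim))  # learnable prompt
            self.head = nn.Linear(feat_dim, num_classes)          # small task head

        def forward(self, x, adj):
            # The prompt broadcasts over all N nodes and reshapes the input space
            # so the frozen encoder better fits the downstream task.
            h = self.encoder(x + self.prompt, adj)
            return self.head(h)

    # Usage: only the prompt and head receive gradients.
    N, D, C = 5, 16, 3
    encoder = TinyGCNLayer(D, D)             # stands in for a pre-trained GNN
    model = PromptedGNN(encoder, D, C)
    x, adj = torch.randn(N, D), torch.eye(N)
    logits = model(x, adj)                   # (N, C) node-level predictions
    logits.sum().backward()                  # encoder grads stay None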


Cited by 37 publications (9 citation statements) · References 33 publications

“…We present more details of the baselines in Appendix B. It is worth noting that certain few-shot methodologies on graphs, such as Meta-GNN [64], AMM-GNN [47], RALE [29], VNT [42], and ProG [41], hinge on the meta-learning paradigm [15], requiring an additional set of labeled base classes in addition to the few-shot classes. Hence, they are not comparable to our framework.…”
Section: Methods (mentioning)
Confidence: 99%
“…Due to the parameter-efficient nature of prompts, they have quickly been popularized in place of fine-tuning larger pre-trained models, or when the downstream task has only few-shot labels. Given these advantages, prompt-based learning has also been explored on graphs [30, 40-42, 58].…”
Section: Related Work (mentioning)
Confidence: 99%
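As a rough illustration of the parameter-efficiency point made in the quote above, the snippet below (continuing the hypothetical PromptedGNN sketch from earlier) compares trainable versus total parameters; the helper count_params is an assumption, not a library API.

    # Continuing the sketch above: only the prompt (D values) and the head
    # ((D+1)*C values) are trainable; the frozen encoder is not updated.
    def count_params(module, trainable_only=False):
        return sum(p.numel() for p in module.parameters()
                   if p.requires_grad or not trainable_only)

    total = count_params(model)
    trainable = count_params(model, trainable_only=True)
    print(f"trainable: {trainable} / total: {total}")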
“…Simultaneously, the integration of graph neural networks (GNNs) has emerged as a pivotal innovation, enhancing the representation of complex entity relationships essential for multi-hop QA. The introduction of methods like multi-task prompting for graph neural networks highlights the potential of leveraging pre-trained models across various graph tasks, effectively bridging the gap between general graph knowledge and specific application needs [17]. The exploration of hypergraph representation for sociological analysis emphasizes the richness of social interactions and environments, providing a novel approach to understanding complex sociological phenomena through data mining techniques [18].…”
Section: Introduction (mentioning)
Confidence: 99%