2022
DOI: 10.1007/s10462-022-10321-2
A survey of graph neural networks in various learning paradigms: methods, applications, and challenges

Cited by 43 publications (12 citation statements)
References: 216 publications
“…GNNs are commonly employed for processing graph data, which are composed of nodes and edges connecting the nodes, denoted as G = (V, E), where V and E represent the nodes and edges of the graph, respectively. Chemical molecules can be represented as natural graphs, where atoms correspond to the nodes of the graph, and chemical bonds between atoms correspond to the edges.…”
Section: Data Description and Methods
Confidence: 99%
“…GNNs are commonly employed for processing graph data, which are composed of nodes and edges connecting the nodes, denoted as G = (V, E), where V and E represent the nodes and edges of the graph, respectively [43]. Chemical molecules can be represented as natural graphs, where atoms correspond to the nodes of the graph, and chemical bonds between atoms correspond to the edges. In traditional GNN models, only the feature information on nodes and the connectivity between nodes are considered during the computation process, without further utilizing other features of the edges.…”
Section: Introduction
Confidence: 99%
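To make the G = (V, E) representation in this excerpt concrete, the sketch below hand-encodes a small molecule as node and edge lists with simple features; it is an illustration only, not taken from the surveyed paper or the citing works, and the molecule, feature choices, and aggregation rule are all assumptions. The final step shows the point made in the last quoted sentence: a traditional GNN layer uses only node features and connectivity, ignoring edge features.

```python
# Minimal sketch (illustrative, not from the cited works): a molecule encoded
# as G = (V, E) with node (atom) features and edge (bond) features.
# Atom/bond data are hand-written for formaldehyde (H2C=O).

# V: one entry per atom; the feature is the atomic number.
node_features = [6, 8, 1, 1]            # C, O, H, H

# E: undirected bonds as index pairs into V, plus a bond-order feature.
edges = [(0, 1), (0, 2), (0, 3)]         # C=O, C-H, C-H
edge_features = [2.0, 1.0, 1.0]          # bond orders

def mean_neighbor_aggregation(node_features, edges):
    """One message-passing step: each node averages its neighbors' features.
    A 'traditional' GNN layer in the sense of the quote uses only node
    features and connectivity; edge_features are ignored here."""
    n = len(node_features)
    sums, counts = [0.0] * n, [0] * n
    for u, v in edges:                   # undirected: propagate both ways
        sums[u] += node_features[v]; counts[u] += 1
        sums[v] += node_features[u]; counts[v] += 1
    return [s / c if c else x for s, c, x in zip(sums, counts, node_features)]

print(mean_neighbor_aggregation(node_features, edges))
# -> [3.333..., 6.0, 6.0, 6.0]
```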
“…The actual embedding gradients g_e are amalgamated with the pseudo item embedding gradients g_p, and the unified gradient of the model g_m and embeddings g_e on the device is modified as g = (g_m, g_e, g_p) (Eq. 10). The second strategy involves incorporating Laplace noise into the feature embeddings processed through the GCN model to ensure local differential privacy. Subsequently, we clip the embeddings based on their L1-norm with a threshold δ, and introduce an LDP module with zero-mean Laplacian noise to the unified embeddings to fortify user privacy protection.…”
Section: The Privacy Protection Module
Confidence: 99%
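The clip-then-perturb step described in this excerpt can be sketched generically. The snippet below is an assumption-laden illustration rather than the cited work's implementation: `delta` stands in for the L1 threshold δ and `noise_scale` for the Laplace scale parameter, which in a real LDP deployment would be derived from the privacy budget.

```python
import numpy as np

def clip_and_perturb(embedding, delta, noise_scale, rng=None):
    """L1-clip `embedding` to norm <= delta, then add zero-mean Laplace noise.
    Generic sketch of the mechanism described in the quote above."""
    rng = rng or np.random.default_rng()
    emb = np.asarray(embedding, dtype=float)
    l1 = np.abs(emb).sum()
    if l1 > delta:                        # rescale only if the norm exceeds delta
        emb = emb * (delta / l1)
    noise = rng.laplace(loc=0.0, scale=noise_scale, size=emb.shape)
    return emb + noise

# Example: protect a 4-dimensional embedding before it leaves the device.
protected = clip_and_perturb([0.9, -1.2, 0.3, 0.5], delta=1.0, noise_scale=0.2)
print(protected)
```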
“…It does not emphasise studies [17,18,19] that used graph-structured data from other medical sources, such as clinical data and longitudinal patient survey data. Meanwhile, Waikhom and Patgiri [20] reviewed the literature on using graph neural networks in various learning paradigms, including addressing the common formatting of graphical information and general standards or schemas that exist for the construction of graphical knowledge. However, no study in the present literature reviews disease prediction using graph ML approaches based on electronic health data.…”
Section: Introduction
Confidence: 99%