2023
DOI: 10.1109/tkde.2023.3250264

Automated Graph Neural Network Search Under Federated Learning Framework

Cited by 6 publications (4 citation statements)
References 24 publications
“…Existing works pay attention to the GNNs architecture search problem. FL-AGNNS [69] proposes a federated evolutionary optimization strategy to search for suitable GCN architectures. It applies a GCN SuperNet and a weight-sharing strategy to reduce the searching time so that the proposed algorithm can find better GCN models in a short time.…”
Section: A. Horizontal FedGNNs
confidence: 99%
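The excerpt describes FL-AGNNS only at a high level. As an illustration of the general pattern it names (evolutionary search over candidate GCN architectures, with a weight-sharing supernet making fitness evaluation cheap), here is a minimal Python sketch; the search space, operators, and fitness function are hypothetical stand-ins, not FL-AGNNS's actual implementation.

```python
import random

# Hypothetical search space: an architecture is a tuple of per-layer operator choices.
OPS = ["gcn", "gat", "sage"]   # candidate layer operators
DEPTHS = [2, 3, 4]             # candidate network depths

def random_arch():
    return tuple(random.choice(OPS) for _ in range(random.choice(DEPTHS)))

def mutate(arch):
    # Replace one randomly chosen layer's operator.
    layers = list(arch)
    layers[random.randrange(len(layers))] = random.choice(OPS)
    return tuple(layers)

def evolutionary_search(fitness, pop_size=20, generations=10):
    """fitness(arch) -> float. With a weight-sharing supernet, this would be
    the validation score of the sub-network sliced out of the supernet, so no
    candidate is trained from scratch -- the speed-up the excerpt refers to."""
    population = [random_arch() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[: pop_size // 2]   # keep the fitter half
        children = [mutate(random.choice(parents))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

# Toy usage with a made-up fitness (in real use: supernet validation accuracy).
best = evolutionary_search(lambda a: a.count("gat") + 0.1 * len(a))
print(best)
```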
“…The proposed aggregation algorithms are based on FedAVG, optimal transportation (Y. Lin et al., 2020), SGNN, or the Huber loss function. Regarding the tuning of hyperparameters and the network structure, the authors have applied Bayesian optimization (L. Zheng et al., 2021) and evolutionary optimization strategies (C. Wang, Chen, et al., 2021).…”
Section: FL Applications
confidence: 99%
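Of the aggregation rules the excerpt lists, FedAVG is the simplest to make concrete: the server averages client parameter vectors weighted by their local sample counts. A minimal NumPy sketch (variable names are illustrative):

```python
import numpy as np

def fed_avg(client_weights, client_sizes):
    """Weighted average of client parameter vectors (FedAvg).

    client_weights: list of 1-D np.ndarray, one flattened model per client
    client_sizes:   list of int, local training-sample counts
    """
    coeffs = np.array(client_sizes, dtype=float) / sum(client_sizes)
    return coeffs @ np.stack(client_weights)   # weighted sum over clients

# Example: three clients whose data volumes differ.
w = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
print(fed_avg(w, [10, 30, 60]))   # -> [4. 5.]
```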
“…DP achieves this by adding a controlled amount of noise to the data before it is transmitted to the aggregator. The level of noise is determined by a privacy parameter, which plays a role in balancing the trade-off, as explained in reference [34]. This parameter connects the model to the process of updating from multiple devices, thus enabling the model to improve collectively [37].…”
Section: Background and Related Work
confidence: 99%
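To make the noise-addition step concrete, below is a minimal sketch of the Gaussian mechanism commonly used for DP in federated learning. The L2 clipping bound and the (epsilon, delta) noise calibration shown are a standard textbook form, not necessarily the exact scheme of reference [34]; a smaller epsilon means more noise, i.e. stronger privacy at the cost of utility, which is the trade-off the excerpt describes.

```python
import numpy as np

def dp_gaussian_update(update, clip_norm=1.0, epsilon=1.0, delta=1e-5):
    """Clip a client's model update and add calibrated Gaussian noise
    before sending it to the aggregator (Gaussian mechanism)."""
    # 1. Bound each client's influence: L2-clip, so sensitivity <= clip_norm.
    scale = min(1.0, clip_norm / (np.linalg.norm(update) + 1e-12))
    clipped = update * scale
    # 2. Standard Gaussian-mechanism calibration for (epsilon, delta)-DP.
    sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return clipped + np.random.normal(0.0, sigma, size=clipped.shape)

# Example: one noisy client update, ready for aggregation.
print(dp_gaussian_update(np.array([0.5, -1.2, 0.3])))
```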