2021
DOI: 10.1109/mnet.001.2100266

IGNNITION: Bridging the Gap between Graph Neural Networks and Networking Systems

Abstract: Recent years have seen the vast potential of Graph Neural Networks (GNN) in many fields where data is structured as graphs (e.g., chemistry, recommender systems). In particular, GNNs are becoming increasingly popular in the field of networking, as graphs are intrinsically present at many levels (e.g., topology, routing). The main novelty of GNNs is their ability to generalize to other networks unseen during training, which is an essential feature for developing practical Machine Learning (ML) solutions for net…
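To give a concrete picture of the mechanism the abstract refers to, the following is an illustrative sketch only (not IGNNITION's API, and not code from the article) of a single GNN message-passing step; because each update depends only on a node's local neighbourhood, the same learned weights can be applied to topologies unseen during training.

```python
# Illustrative only (not IGNNITION's API): one message-passing step of a GNN.
import numpy as np

def message_passing_step(h, edges, W_msg, W_upd):
    """h: (num_nodes, dim) node states; edges: iterable of (src, dst) pairs."""
    agg = np.zeros_like(h)
    for src, dst in edges:            # message from src, summed at dst
        agg[dst] += h[src] @ W_msg
    return np.tanh(h @ W_upd + agg)   # node-state update

rng = np.random.default_rng(0)
dim = 4
h = rng.normal(size=(3, dim))         # toy 3-node topology: 0 -> 1 -> 2 -> 0
W_msg, W_upd = rng.normal(size=(dim, dim)), rng.normal(size=(dim, dim))
h = message_passing_step(h, [(0, 1), (1, 2), (2, 0)], W_msg, W_upd)
```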

Cited by 19 publications (6 citation statements)
References 7 publications
“…GNN models are tested in situations in which the number of PMUs is optimal (with minimal measurement redundancy under which the WLS SE provides a solution), in underdetermined scenarios, as well as in scenarios with maximal measurement redundancy. We used the IGNNITION framework [25] for…”
Section: Numerical Results (mentioning, confidence: 99%)
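As a hypothetical illustration of the distinction drawn in this quote (not code from the cited paper): the weighted-least-squares (WLS) state estimate solves (HᵀWH)x = HᵀWz, so a solution exists only when the measurement Jacobian H has full column rank; the "underdetermined" PMU placements are exactly the cases where this fails.

```python
# Hypothetical illustration: WLS state estimation is solvable only when the
# measurement Jacobian H has full column rank.
import numpy as np

def wls_solvable(H: np.ndarray) -> bool:
    """True if the WLS normal equations (H^T W H) x = H^T W z have a unique solution."""
    return np.linalg.matrix_rank(H) == H.shape[1]

H_redundant       = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # 3 measurements, 2 states
H_underdetermined = np.array([[1.0, 1.0]])                          # 1 measurement, 2 states
print(wls_solvable(H_redundant), wls_solvable(H_underdetermined))   # True False
```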
“…In this section, we describe the GNN model's training process and test the trained model on various examples to validate its accuracy, and its robustness under measurement data loss due to communication failure and cyber attacks in the form of malicious data injections. The described GNN model for augmented factor graphs is implemented using the IGNNITION library [22] and trained for 100 epochs in all experiments using the following hyperparameters: 64 elements in the node embedding vector, 32 graphs in a mini-batch, four GNN layers, ReLU activation functions, Adam optimiser, and a learning rate of 4 × 10⁻⁴. We conducted separate training experiments for IEEE 30 and IEEE 118-bus test cases, for which we generated a training set containing 10000 samples and validation and test sets containing 100 samples each.…”
Section: Numerical Results and Discussion (mentioning, confidence: 99%)
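To make the quoted configuration concrete, below is a hedged PyTorch sketch (not the authors' code and not IGNNITION's API) wiring up the stated hyperparameters: 64-dimensional node embeddings, four GNN layers, ReLU activations, Adam with a 4 × 10⁻⁴ learning rate, and constants for the 32-graph mini-batches and 100 epochs. The architecture, graph batching, and per-node readout are simplified placeholders.

```python
# Hedged sketch of a GNN matching the quoted hyperparameters; the full training
# loop over batched graphs is omitted.
import torch
import torch.nn as nn

EMB_DIM, N_LAYERS, LR, BATCH_SIZE, EPOCHS = 64, 4, 4e-4, 32, 100

class SimpleGNN(nn.Module):
    def __init__(self, in_dim, emb_dim=EMB_DIM, n_layers=N_LAYERS):
        super().__init__()
        self.encode = nn.Linear(in_dim, emb_dim)
        self.layers = nn.ModuleList([nn.Linear(2 * emb_dim, emb_dim) for _ in range(n_layers)])
        self.readout = nn.Linear(emb_dim, 1)            # one estimate per node

    def forward(self, x, edge_index):
        h = torch.relu(self.encode(x))
        src, dst = edge_index
        for layer in self.layers:
            msg = torch.zeros_like(h).index_add_(0, dst, h[src])   # sum of neighbour states
            h = torch.relu(layer(torch.cat([h, msg], dim=-1)))
        return self.readout(h)

model = SimpleGNN(in_dim=2)
opt = torch.optim.Adam(model.parameters(), lr=LR)

# Toy usage on a 3-node graph with 2 input features per node.
x = torch.randn(3, 2)
edge_index = torch.tensor([[0, 1, 2], [1, 2, 0]])
pred = model(x, edge_index)
```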
“…In this section, we conduct comprehensive numerical tests to evaluate the effectiveness of proposed augmented factor graph-based GNN approaches for linear and nonlinear SE problems. We used the IGNNITION framework [151] for building and utilizing GNN models, with the hyperparameters presented in Table 4.1, the first three of which were obtained with the grid search hyperparameter optimization using the Tune tool [152]. All the results presented in this section are normalized using the corresponding nominal voltages in the test power systems and a base power of 100 MVA.…”
Section: Numerical Results (mentioning, confidence: 99%)
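The quote above notes that the first three hyperparameters were selected by grid search with the Tune tool [152]. The sketch below is a generic plain-Python stand-in for that procedure, not the cited work's setup: the grid values are illustrative, and train_and_validate is a hypothetical placeholder that would train one GNN and return its validation loss.

```python
# Generic grid-search stand-in; grid values and the scoring function are
# illustrative placeholders, not taken from the cited paper.
import random
from itertools import product

search_space = {
    "embedding_size": [32, 64, 128],
    "batch_size": [16, 32, 64],
    "learning_rate": [1e-4, 4e-4, 1e-3],
}

def train_and_validate(params):
    # Placeholder: a real run would build, train, and validate the GNN here.
    return random.random()

candidates = (dict(zip(search_space, values))
              for values in product(*search_space.values()))
best = min(candidates, key=train_and_validate)
print("selected hyperparameters:", best)
```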