Proceedings of the 31st ACM International Conference on Information & Knowledge Management, 2022
DOI: 10.1145/3511808.3557628
Leveraging the Graph Structure of Neural Network Training Dynamics

Cited by 2 publications (2 citation statements)
References 11 publications
“…They proposed "Unroll", which converts convolutional layers into graphs. Then, Vahedian et al. [34] proposed a "Rolled" graph representation of convolutional layers to solve the DNN performance prediction problem by capturing the early DNN dynamics during the training phase. To maintain the semantic meaning of the convolutional layers, they represent each filter as a node and link the filters in successive layers by weighted edges.…”
Section: Graph Structure of DNNs
confidence: 99%
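The "Rolled" construction quoted above (one node per filter, weighted edges between filters in successive layers) can be sketched as follows. The weight layout `[out_channels][in_channels][kh][kw]` and the L1-norm edge weighting are illustrative assumptions, not the paper's exact definition:

```python
import random

def rolled_graph(conv_weights):
    """Sketch of a 'Rolled' graph: one node per conv filter, weighted
    edges between filters in successive layers.

    conv_weights: list of nested lists, one per conv layer, shaped
    [out_channels][in_channels][kh][kw] (assumed layout).
    """
    nodes, edges = [], []
    for l, w in enumerate(conv_weights):
        for f in range(len(w)):
            nodes.append((l, f))  # node id = (layer index, filter index)
    for l in range(1, len(conv_weights)):
        w = conv_weights[l]  # kernels connecting layer l-1 -> layer l
        for out_f, in_slices in enumerate(w):
            for in_f, kernel in enumerate(in_slices):
                # Edge weight: L1 norm of the 2-D kernel slice linking
                # filter in_f (layer l-1) to filter out_f (layer l).
                # (Assumed weighting; the paper may use another norm.)
                strength = sum(abs(v) for row in kernel for v in row)
                edges.append(((l - 1, in_f), (l, out_f), strength))
    return nodes, edges

# Toy example: two conv layers with 2 and 3 filters, 3x3 kernels.
random.seed(0)
w1 = [[[[random.gauss(0, 1) for _ in range(3)] for _ in range(3)]]
      for _ in range(2)]                       # 2 filters, 1 input channel
w2 = [[[[random.gauss(0, 1) for _ in range(3)] for _ in range(3)]
       for _ in range(2)] for _ in range(3)]   # 3 filters, 2 input channels
nodes, edges = rolled_graph([w1, w2])
print(len(nodes), len(edges))  # 5 nodes (2 + 3), 6 edges (2 * 3)
```

Because every filter stays a single node regardless of kernel size, this "rolled" form keeps the graph small while still exposing how inter-layer connection strengths evolve during training.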
“…Existing DAG studies have converted DNNs into graphs to discuss the dynamic properties of the DNN model [32][33][34][35]. They used undirected weighted graphs, treating neurons as nodes and the model's weights as weighted edges between nodes.…”
Section: Overview
confidence: 99%
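The neuron-level representation described in this statement (neurons as nodes, model weights as undirected weighted edges) can be sketched for fully connected layers. The `[n_in][n_out]` weight layout and the use of absolute values as edge weights are illustrative assumptions:

```python
def dnn_to_graph(layer_weights):
    """Sketch: build an undirected weighted graph from dense layers.

    layer_weights: list of 2-D lists, where layer_weights[l][i][j] is
    the weight from neuron i in layer l to neuron j in layer l+1
    (assumed layout).
    """
    adj = {}  # node (layer, index) -> {neighbor node: edge weight}

    def add_edge(u, v, w):
        adj.setdefault(u, {})[v] = w
        adj.setdefault(v, {})[u] = w  # undirected: store both directions

    for l, w in enumerate(layer_weights):
        for i, row in enumerate(w):
            for j, val in enumerate(row):
                # |weight| as edge weight is an assumption here.
                add_edge((l, i), (l + 1, j), abs(val))
    return adj

# Toy 2 -> 2 network with a single weight matrix.
g = dnn_to_graph([[[0.5, -1.0], [2.0, 0.0]]])
print(g[(0, 0)])  # {(1, 0): 0.5, (1, 1): 1.0}
```

Storing both directions of each edge makes neighborhood lookups symmetric, matching the undirected-graph view these studies take of the trained model.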