2019 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2019.8851955
Stable Network Morphism

Abstract: We present in this paper a systematic study on how to morph a well-trained neural network into a new one so that its network function is completely preserved. We define this as network morphism in this research. After morphing a parent network, the child network is expected to inherit the knowledge from its parent network and also to have the potential to continue growing into a more powerful one with much shortened training time. The first requirement for this network morphism is its ability to handle diverse m…
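The core idea of a function-preserving morphism can be illustrated with a Net2DeeperNet-style operation: inserting an identity-initialized layer into a trained network so that the child computes exactly the same function as the parent. The sketch below is illustrative only (the `MLP` and `deepen` names are hypothetical, not from the paper) and assumes a plain ReLU MLP:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0)

class MLP:
    """Tiny ReLU MLP: layers is a list of (W, b) applied as relu(x @ W + b)."""
    def __init__(self, layers):
        self.layers = layers

    def forward(self, x):
        for W, b in self.layers:
            x = relu(x @ W + b)
        return x

def deepen(net, layer_idx):
    """Insert an identity-initialized layer after `layer_idx`.
    Since relu(h) >= 0, we have relu(relu(h) @ I + 0) == relu(h),
    so the child network's function equals the parent's."""
    d = net.layers[layer_idx][0].shape[1]  # width at the insertion point
    new_layers = list(net.layers)
    new_layers.insert(layer_idx + 1, (np.eye(d), np.zeros(d)))
    return MLP(new_layers)

rng = np.random.default_rng(0)
parent = MLP([(rng.standard_normal((4, 8)), np.zeros(8)),
              (rng.standard_normal((8, 3)), np.zeros(3))])
child = deepen(parent, 0)  # now 3 layers deep, same function

x = rng.standard_normal((5, 4))
assert np.allclose(parent.forward(x), child.forward(x))
```

The identity initialization only guarantees preservation here because ReLU is idempotent on non-negative inputs; other activations require the more general morphing equations studied in the paper.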


Cited by 53 publications (80 citation statements). References 28 publications.
“…Another approach to speed up performance estimation is to initialize the weights of novel architectures based on weights of other architectures that have been trained before. One way of achieving this, dubbed network morphisms [64], allows modifying an architecture while leaving the function represented by the network unchanged [10,11,21,22]. This allows increasing the capacity of networks successively and retaining high performance without requiring training from scratch.…”
Section: Performance Estimation Strategy
Confidence: 99%
“…Network morphism. In [4,32], a systematic study has been done on how to morph a well-trained neural network into a new one so that its network function can be completely preserved for further training. This network morphism can constitute a severe attack against our watermark because it may be impossible to detect the embedded watermark if the topology of the host network is severely modified.…”
Section: Future Work
Confidence: 99%
“…Network morphism. In [8,51], a systematic study has been conducted on how to morph a well-trained neural network into a new one so that its network function can be completely preserved for further training. This network morphism can constitute a severe attack against our watermark because it may be impossible to detect the embedded watermark if the topology of the host network undergoes major modification.…”
Section: Further Expected Developments
Confidence: 99%