2019
DOI: 10.48550/arxiv.1904.09090
Preprint
SCANN: Synthesis of Compact and Accurate Neural Networks

Cited by 8 publications (22 citation statements)
References 0 publications
“…[22], [23], [24] propose to grow part of the connections based on the gradients of the loss and prune them back to the desired sparsity. Similar efforts with different growing criteria are made in [25], [26], [27]. While all the non-zero weights in existing grow-and-prune works are free to be updated, our work fundamentally differs in that the tiny sub-model is permanently frozen during all processes to guarantee the weight hierarchy.…”
Section: Related Work
confidence: 99%
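The grow-and-prune cycle quoted above (grow connections where the loss gradient is largest, then prune back to a target sparsity) can be sketched roughly as follows; the masking scheme, zero-initialization of grown weights, and function names are illustrative assumptions rather than the exact procedure of [22]-[27].

```python
import torch

def grow_by_gradient(weight, mask, grad, num_grow):
    """Re-enable the num_grow currently-pruned connections with the largest |gradient|."""
    scores = grad.abs() * (mask == 0)            # only score inactive positions
    idx = torch.topk(scores.view(-1), num_grow).indices
    mask.view(-1)[idx] = 1.0                     # grow these connections
    weight.data.view(-1)[idx] = 0.0              # grown weights start at zero (assumption)
    return mask

def prune_by_magnitude(weight, mask, sparsity):
    """Prune back to the desired sparsity by dropping the smallest-magnitude weights."""
    num_keep = int(weight.numel() * (1.0 - sparsity))
    threshold = torch.topk(weight.abs().view(-1), num_keep).values.min()
    mask = (weight.abs() >= threshold).float()
    weight.data *= mask                          # pruned weights are held at zero
    return mask
```

In the cited grow-and-prune works all surviving weights remain trainable between these steps, whereas the quoting paper additionally keeps a small sub-model permanently frozen.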
“…Han et al. [32] have shown the effectiveness of pruning in removing redundancy in CNN and multilayer-perceptron architectures. Grow-and-prune DNN synthesis uses network growth followed by network pruning in an iterative process to improve model performance while ensuring its compactness [14], [15].…”
Section: Efficient Neural Network Synthesis
confidence: 99%
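The iterative prune-retrain process referenced here (Han et al.-style magnitude pruning interleaved with retraining) could look like the sketch below; the per-round pruning fraction, the choice to skip 1-D parameters, and the train_fn hook are illustrative assumptions.

```python
import torch

def iterative_prune(model, train_fn, rounds=3, prune_frac=0.2):
    """Alternate magnitude pruning with retraining to recover accuracy."""
    masks = {}
    for _ in range(rounds):
        for name, p in model.named_parameters():
            if p.dim() < 2:                      # skip biases and norm parameters
                continue
            k = max(1, int(p.numel() * prune_frac))
            thresh = torch.kthvalue(p.abs().view(-1), k).values
            masks[name] = (p.abs() > thresh).float()
            p.data *= masks[name]                # zero out the pruned weights
        train_fn(model, masks)                   # retrain; caller keeps masked weights at zero
    return model, masks
```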
“…Hence, MHDeep uses synthetic data drawn from the same probability distribution as the real data to augment the dataset. It also leverages a grow-and-prune DNN synthesis approach [14], [15] to train accurate and computationally efficient neural network models to detect the mental health condition of the user.…”
Section: Introduction
confidence: 99%
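The augmentation step described in this statement, drawing synthetic samples from the same probability distribution as the real data, could be approximated as below; the per-class Gaussian mixture, scikit-learn's GaussianMixture, and the sample counts are assumptions for illustration, not MHDeep's exact generator.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def augment_with_synthetic(X, y, per_class=500, n_components=3, seed=0):
    """Fit a per-class mixture to the real features and sample synthetic points from it."""
    X_parts, y_parts = [X], [y]
    for c in np.unique(y):
        gmm = GaussianMixture(n_components=n_components, random_state=seed)
        gmm.fit(X[y == c])                   # estimate the class-conditional distribution
        X_syn, _ = gmm.sample(per_class)     # draw synthetic feature vectors
        X_parts.append(X_syn)
        y_parts.append(np.full(per_class, c))
    return np.concatenate(X_parts), np.concatenate(y_parts)
```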
“…Since the dataset is very small, consisting of just 140 data points, we were unable to adequately train a neural network [52]. However, when our methodology is applied to a larger scope of cyberattacks, a neural network model might be an effective tool [53].…”
Section: Applying Machine Learning
confidence: 99%