2022
DOI: 10.1016/j.cor.2021.105676

Efficient and sparse neural networks by pruning weights in a multiobjective learning approach

Cited by 16 publications (5 citation statements)
References 13 publications
“…In a more general setting, multiobjective training approaches were suggested in [20] to trade off between data loss and regularization terms in the context of image recognition. The different characteristics (slope and curvature) of the considered training goals are addressed by enhancing the stochastic multigradient descent approach [17] with pruning strategies, and by combining adaptive weighted-sum scalarizations with interval bisection.…”
Section: Related Research (citation type: mentioning)
confidence: 99%
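
The trade-off described in this excerpt can be sketched as a weighted-sum scalarization of the two training goals. The following is a minimal illustration assuming a PyTorch-style setup; the toy model, the dummy data, and the fixed weight alpha are assumptions made for exposition, whereas the cited approach adapts the weight and adds pruning.

```python
# Minimal sketch of a weighted-sum scalarization of two training goals:
# a data loss and an l1 sparsity penalty. All names and hyperparameters
# here are illustrative assumptions, not the cited paper's setup.
import torch
import torch.nn.functional as F

model = torch.nn.Linear(20, 3)                      # toy classifier
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(64, 20)                             # dummy inputs
y = torch.randint(0, 3, (64,))                      # dummy labels
alpha = 0.1                                         # scalarization weight in (0, 1)

for _ in range(100):
    opt.zero_grad()
    data_loss = F.cross_entropy(model(x), y)             # goal 1: data loss
    l1 = sum(p.abs().sum() for p in model.parameters())  # goal 2: sparsity
    ((1 - alpha) * data_loss + alpha * l1).backward()    # weighted sum
    opt.step()
```

With alpha near 0 the data loss dominates; with alpha near 1 the l1 term drives many weights toward zero, which is what makes subsequent pruning effective.
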
“…Algorithm 1 summarizes the implementation of the bisection enhanced dichotomic search (BEDS), which enhances the dichotomic search scheme with an occasional bisection step. It is based on [20] and accounts for the fact that NN training with a given weight parameter α (here using the Adam optimizer) may terminate in a local minimum rather than the global minimum assumed in the general dichotomic search procedure.…”
Section: ] (Own Illustration) (citation type: mentioning)
confidence: 99%
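
As a rough illustration of how such a scheme can be organized, the sketch below implements a dichotomic search with a bisection fallback on a synthetic biobjective problem. It is not the paper's Algorithm 1: the toy solve() (an exact grid search on a synthetic convex front) stands in for NN training with Adam under a fixed weight α, which in practice may return only a local minimum and thereby trigger the bisection step.

```python
# Schematic sketch of a bisection enhanced dichotomic search (BEDS) in the
# spirit of the excerpt above; not the cited paper's Algorithm 1. solve()
# stands in for training the network under a fixed weight alpha and may,
# in general, return only a local minimum of the scalarized problem.

def solve(alpha, grid=1000):
    """Minimize alpha*f1 + (1-alpha)*f2 on a toy front f1=t^2, f2=(1-t)^2."""
    t = min((i / grid for i in range(grid + 1)),
            key=lambda t: alpha * t**2 + (1 - alpha) * (1 - t)**2)
    return (t**2, (1 - t)**2)

def beds(a1, z1, a2, z2, tol=1e-4, depth=5):
    """Refine the front between z1 = solve(a1) and z2 = solve(a2)."""
    if depth == 0:
        return []
    w1, w2 = z1[1] - z2[1], z2[0] - z1[0]   # weight normal to segment z1--z2
    if w1 <= tol or w2 <= tol:
        return []
    alpha = w1 / (w1 + w2)                  # standard dichotomic weight
    z = solve(alpha)
    ref = w1 * z1[0] + w2 * z1[1] - tol     # scalarized value on the segment
    if w1 * z[0] + w2 * z[1] >= ref:
        # No improvement: possibly a local minimum of the training problem.
        # Retry once with an occasional bisection step on the weight.
        alpha = 0.5 * (a1 + a2)
        z = solve(alpha)
        if w1 * z[0] + w2 * z[1] >= ref:
            return []
    return (beds(a1, z1, alpha, z, tol, depth - 1) + [z]
            + beds(alpha, z, a2, z2, tol, depth - 1))

za, zb = solve(1.0), solve(0.0)             # endpoints of the front
front = [za] + beds(1.0, za, 0.0, zb) + [zb]
print(front)
```

On this exact toy solver the bisection branch never fires; its purpose is to avoid discarding a weight interval when an inexact trainer returns a spuriously poor point.
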
“…Sparse control methods offer a solution by leveraging the sparsity or locality properties of system dynamics. To address such a problem, Reiners et al. studied and verified the efficiency of sparse neural networks (SNNs) [30]. By focusing control inputs on key aspects of the system dynamics, sparse control methods achieve more efficient control.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
“…Due to the centrality of the problem, alternative approaches have been proposed in the literature. Among many, Reiners et al. (2022) and Burke and Flanders (1995) explain how a key aspect in the learning and training of deep neural networks is the capability of correctly selecting the network architecture. An optimal network structure not only avoids overparameterization and overfitting, but also improves optimization performance.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
“…Burke and Flanders (1995) propose to leverage a new class of networks, called ontogenic neural networks, that iteratively learn the optimal network structure during backpropagation via prototype units. Conversely, Reiners et al. (2022) suggest considering a biobjective optimization problem. Implementing such a strategy entails differentiating, while training the neural network, between a measure used to evaluate prediction accuracy (e.g., cross-entropy in classification problems) and a penalty function used to assess the total complexity of the network parameters.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
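
Written out, the strategy described in this excerpt amounts to a biobjective problem of the following shape; the l1 penalty is shown only as one common complexity measure, since the excerpt does not fix a specific one.

```latex
% Biobjective training: prediction accuracy vs. network complexity.
% \Omega is a generic penalty; \|\theta\|_1 is an illustrative choice.
\min_{\theta \in \mathbb{R}^n} \; \bigl( L(\theta),\, \Omega(\theta) \bigr),
\qquad L(\theta) = \text{cross-entropy loss}, \quad
\Omega(\theta) = \|\theta\|_1 \ \text{(for example)}.
```

A weighted-sum scalarization with weight α ∈ (0, 1), as searched over by the dichotomic scheme quoted above, then reads min_θ (1 − α) L(θ) + α Ω(θ).
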