2020
DOI: 10.48550/arXiv.2006.09083
Preprint

Reusing Trained Layers of Convolutional Neural Networks to Shorten Hyperparameters Tuning Time

Abstract: Hyperparameter tuning is a time-consuming process, particularly when the architecture of the neural network is decided as part of it. For instance, in convolutional neural networks (CNNs), the number and the characteristics of the hidden (convolutional) layers may have to be selected. This implies that the search process involves training every one of these candidate network architectures. This paper describes a proposal to reuse the weights of hidden (convolutional) layers among different tra…
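The truncated abstract outlines the core idea: when a hyperparameter search evaluates several candidate CNN depths, the convolutional layers already trained for one candidate can seed the matching layers of the next, so only the new layers start from scratch. Below is a minimal sketch of that reuse pattern, not the authors' implementation; the PyTorch framing, the layer sizes, and the optional freezing of reused layers are all assumptions.

```python
# Minimal sketch (not the paper's code) of reusing trained conv layers
# across candidate CNN architectures during a depth search.
# Channel widths (16/32/64) and 3x3 kernels are illustrative assumptions.
import torch.nn as nn

def make_candidate(num_conv_layers: int) -> nn.Module:
    """Build a candidate CNN with a variable number of conv layers."""
    layers, in_ch = [], 1
    for i in range(num_conv_layers):
        out_ch = 16 * (2 ** min(i, 2))  # 16, 32, 64, 64, ...
        layers += [nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU()]
        in_ch = out_ch
    layers += [nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(in_ch, 10)]
    return nn.Sequential(*layers)

def reuse_conv_weights(trained: nn.Module, candidate: nn.Module) -> None:
    """Copy weights of shape-matching conv layers from an already-trained
    model into a new candidate, so only the extra layers train from scratch."""
    trained_convs = [m for m in trained.modules() if isinstance(m, nn.Conv2d)]
    cand_convs = [m for m in candidate.modules() if isinstance(m, nn.Conv2d)]
    for src, dst in zip(trained_convs, cand_convs):
        if src.weight.shape == dst.weight.shape:
            dst.load_state_dict(src.state_dict())
            dst.requires_grad_(False)  # assumption: freeze reused layers

# Depth search that reuses layers from the previous candidate:
previous = None
for depth in [2, 3, 4]:
    model = make_candidate(depth)
    if previous is not None:
        reuse_conv_weights(previous, model)
    # train(model, data); evaluate(model, data)  # training loop omitted
    previous = model
```

Whether reused weights should be frozen or fine-tuned is not stated in the truncated abstract, so the freezing line above is a knob, not part of the described method.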

Cited by 1 publication (1 citation statement)
References 5 publications
“…We set λ₂ = λ₃ = 1 following the work [32] and set λ₁ to 1. Since tuning the hyperparameters is time-consuming [46,47], it may be possible to improve accuracy by adjusting these parameters.…”
Section: Segment-based SimCLR with SDFD
Citation type: mentioning (confidence: 99%)
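For context, the quoted λ values presumably weight a sum of three loss terms in the citing paper. A hedged sketch of that common pattern follows; the weighted-sum form and the placeholder term names are assumptions, not taken from the source:

```python
# Hypothetical weighted-sum loss matching the quoted lambdas; the three
# loss terms are placeholders, not the citing paper's actual definitions.
lambda1, lambda2, lambda3 = 1.0, 1.0, 1.0  # values quoted above

def total_loss(l1: float, l2: float, l3: float) -> float:
    return lambda1 * l1 + lambda2 * l2 + lambda3 * l3
```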