2024
DOI: 10.1109/tai.2023.3322394
Minimizing Parameter Overhead in Self-Supervised Models for Target Task

Jaydeep Kishore,
Snehasis Mukherjee
Cited by 2 publications (1 citation statement)
References 45 publications
“…They tested this approach with SimCLR and SwAV on datasets such as CIFAR-10, CIFAR-100, and Tiny ImageNet, and the results were promising compared with other methods. This study was then extended by employing Neural Architecture Search (NAS) to optimize the number of parameters [26]. NAS helps identify a neural network suited to a specific task, reducing the human effort needed to find an optimal architecture for that task [27].…”
Section: Table 1 (Current Advancements in Hyperparameter Optimization)
confidence: 99%
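To illustrate the NAS idea summarized in the citation statement, the sketch below runs a plain random search over small classifier heads placed on top of frozen self-supervised features, scoring each candidate by validation accuracy minus a parameter-count penalty. This is only a minimal sketch of the general technique: the search space, the penalty weight, the synthetic feature tensors, and helper names such as `build_head` are assumptions for illustration, not the method of [26].

```python
# Minimal random-search NAS sketch (illustrative only, not the cited paper's method).
# Searches over small MLP heads on top of frozen self-supervised backbone features,
# trading validation accuracy against parameter count.
import random
import torch
import torch.nn as nn

FEAT_DIM, NUM_CLASSES = 512, 10  # e.g. ResNet-18 feature size, CIFAR-10 classes
# Synthetic stand-ins for pre-extracted backbone features (assumption); replace with real data.
x_train, y_train = torch.randn(2048, FEAT_DIM), torch.randint(0, NUM_CLASSES, (2048,))
x_val, y_val = torch.randn(512, FEAT_DIM), torch.randint(0, NUM_CLASSES, (512,))

def build_head(widths):
    """Build an MLP classifier head from a list of hidden-layer widths."""
    layers, in_dim = [], FEAT_DIM
    for w in widths:
        layers += [nn.Linear(in_dim, w), nn.ReLU()]
        in_dim = w
    layers.append(nn.Linear(in_dim, NUM_CLASSES))
    return nn.Sequential(*layers)

def evaluate(head, epochs=5):
    """Short training run on the features; returns validation accuracy."""
    opt = torch.optim.Adam(head.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(head(x_train), y_train).backward()
        opt.step()
    with torch.no_grad():
        return (head(x_val).argmax(1) == y_val).float().mean().item()

# Random search: score = accuracy minus a small penalty on parameter count,
# so a smaller head wins when accuracy is comparable (penalty weight is an assumption).
best = None
for _ in range(10):
    widths = [random.choice([64, 128, 256]) for _ in range(random.randint(0, 2))]
    head = build_head(widths)
    n_params = sum(p.numel() for p in head.parameters())
    score = evaluate(head) - 1e-6 * n_params
    if best is None or score > best[0]:
        best = (score, widths, n_params)

print(f"best widths={best[1]}, params={best[2]}, score={best[0]:.3f}")
```

In practice the same loop structure applies whether candidates are sampled randomly, by evolutionary search, or by a learned controller; only the sampling step and the accuracy/parameter trade-off weight change.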