2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr42600.2020.01385
Improving One-Shot NAS by Suppressing the Posterior Fading

Cited by 74 publications (36 citation statements) · References 1 publication
“…Although channel attention has shown the highest model stability in our experiments, other attention modules could also be utilized to fulfill additional tasks. Powerful spatial attention modules such as the non-local layer or the graph reasoning layer could be optimized and then applied to our system for the accurate classification of myocardial scar in different coronary artery territories (32, 33). Since image shifting is a commonly encountered challenge, newer outlier detection algorithms could be utilized to shift the generated images more effectively.…”
Section: Discussion (mentioning)
confidence: 99%
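The channel attention referenced in this excerpt is not defined on this page; a minimal, hypothetical sketch of a squeeze-and-excitation-style channel attention module (class and parameter names below are illustrative assumptions, not the cited system's implementation) could look like this in PyTorch:

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Minimal squeeze-and-excitation style channel attention (illustrative):
    global average pooling summarizes each channel, a small bottleneck MLP
    produces per-channel weights, and the input is rescaled channel-wise."""

    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(x.mean(dim=(2, 3)))   # squeeze: per-channel statistics, shape (B, C)
        return x * w.view(b, c, 1, 1)     # excite: rescale each channel of the input

# Usage sketch
att = ChannelAttention(channels=32)
y = att(torch.randn(2, 32, 64, 64))
```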
“…Recently, Liu et al. [34] proposed DARTS, which replaces the discrete search process with a continuously differentiable strategy, allowing gradient-descent-based architecture optimization and resulting in exponentially faster search. However, such a continuously differentiable strategy is less likely to find an optimal architecture [35], [36], [37], [38] compared with RL-based methods. Fig.…”
Section: B. Neural Architecture Search (mentioning)
confidence: 99%
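The continuous relaxation attributed to DARTS in this excerpt can be illustrated with a minimal PyTorch sketch of a softmax-weighted mixture of candidate operations on a single edge; the candidate set and class names below are assumptions for illustration, not the cited implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical candidate operations for one edge of a search cell.
CANDIDATE_OPS = {
    "conv_3x3": lambda c: nn.Conv2d(c, c, 3, padding=1, bias=False),
    "conv_5x5": lambda c: nn.Conv2d(c, c, 5, padding=2, bias=False),
    "max_pool": lambda c: nn.MaxPool2d(3, stride=1, padding=1),
    "identity": lambda c: nn.Identity(),
}

class MixedOp(nn.Module):
    """DARTS-style continuous relaxation of a single edge: the discrete choice
    among operations is replaced by a softmax-weighted sum, so the architecture
    parameters `alpha` can be optimized by gradient descent."""

    def __init__(self, channels: int):
        super().__init__()
        self.ops = nn.ModuleList(op(channels) for op in CANDIDATE_OPS.values())
        # One learnable architecture weight per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# The mixed output is differentiable w.r.t. both the operation weights and
# alpha, so architecture search becomes ordinary gradient-based optimization.
edge = MixedOp(channels=16)
out = edge(torch.randn(2, 16, 32, 32))
out.mean().backward()  # gradients also flow into edge.alpha
```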
“…Because inefficient search strategies require a large number of GPUs, many NAS methods cannot be applied under limited computational resources. To address this challenge, much recent work is dedicated to developing effective methods that reduce the computational cost of performance evaluation, e.g., surrogate-assisted evolutionary algorithms (SAEAs) [33,42,43], information reuse [44,45], and one-shot neural architecture search [46][47][48][49][50], among many others.…”
Section: Neural Architecture Search (mentioning)
confidence: 99%
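The one-shot, weight-sharing idea referenced in this excerpt (and pursued by the paper indexed on this page) can be sketched roughly as follows; the block structure, candidate operations, and names are illustrative assumptions rather than any cited method's actual design:

```python
import random
import torch
import torch.nn as nn

class OneShotBlock(nn.Module):
    """One searchable block: candidate operations live inside a shared
    supernet; at each training step a single path is sampled and updated."""

    def __init__(self, channels: int):
        super().__init__()
        self.candidates = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.Conv2d(channels, channels, 5, padding=2, bias=False),
            nn.Identity(),
        ])

    def forward(self, x: torch.Tensor, choice: int) -> torch.Tensor:
        return self.candidates[choice](x)

class Supernet(nn.Module):
    """Weight-sharing supernet: every sub-network reuses these block weights,
    so candidate architectures can be ranked without training each from scratch."""

    def __init__(self, channels: int = 16, depth: int = 4):
        super().__init__()
        self.blocks = nn.ModuleList(OneShotBlock(channels) for _ in range(depth))

    def forward(self, x: torch.Tensor, arch: list) -> torch.Tensor:
        for block, choice in zip(self.blocks, arch):
            x = block(x, choice)
        return x

# Single-path training sketch: sample a random sub-network per batch, then
# later evaluate candidate architectures with the shared weights.
net = Supernet()
arch = [random.randrange(3) for _ in net.blocks]
out = net(torch.randn(2, 16, 32, 32), arch)
```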