2019 IEEE/CVF International Conference on Computer Vision Workshop (ICCVW) 2019
DOI: 10.1109/iccvw.2019.00251

SqueezeNAS: Fast Neural Architecture Search for Faster Semantic Segmentation

Abstract: For real time applications utilizing Deep Neural Networks (DNNs), it is critical that the models achieve high-accuracy on the target task and low-latency inference on the target computing platform. While Neural Architecture Search (NAS) has been effectively used to develop low-latency networks for image classification, there has been relatively little effort to use NAS to optimize DNN architectures for other vision tasks. In this work, we present what we believe to be the first proxyless hardware-aware search …
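The abstract describes hardware-aware NAS, where the search optimizes for inference latency on the target platform alongside task accuracy. A common way such methods account for latency is to profile each candidate operation on the device and penalize the expected latency of the architecture. The sketch below illustrates that idea in minimal form; the latency numbers, layer structure, and function names are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of a latency-aware NAS objective in the spirit of
# supernet-based, hardware-aware search. All values here are assumed
# for illustration; real methods profile latencies on the target device.
import math

# Assumed per-op latency lookup table (ms), one dict per searchable layer.
LATENCY_TABLE = [
    {"conv3x3": 1.8, "conv5x5": 3.1, "skip": 0.1},
    {"conv3x3": 2.4, "conv5x5": 4.0, "skip": 0.1},
]

def softmax(scores):
    """Turn architecture parameters into op-selection probabilities."""
    m = max(scores.values())
    exps = {k: math.exp(v - m) for k, v in scores.items()}
    total = sum(exps.values())
    return {k: v / total for k, v in exps.items()}

def expected_latency(arch_params):
    """Expected latency: probability-weighted sum of per-op latencies."""
    total = 0.0
    for layer_table, layer_scores in zip(LATENCY_TABLE, arch_params):
        probs = softmax(layer_scores)
        total += sum(probs[op] * layer_table[op] for op in layer_table)
    return total

def nas_loss(task_loss, arch_params, alpha=0.1):
    """Combined objective: task loss plus a weighted latency penalty."""
    return task_loss + alpha * expected_latency(arch_params)

# Usage: one learnable score per candidate op per layer.
params = [
    {"conv3x3": 1.0, "conv5x5": 0.0, "skip": -1.0},
    {"conv3x3": 0.0, "conv5x5": 0.0, "skip": 0.0},
]
print(round(expected_latency(params), 2))  # expected latency in ms
```

During search, gradients of this combined loss with respect to the architecture parameters push the selection probabilities toward ops that are both accurate and fast on the profiled hardware.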

Cited by 62 publications (61 citation statements) · References 34 publications
“…Other pseudo-3D CNN models use adjacent slices as 3D input but 2D convolution kernels, as reported in 48,49; alternatively, a 3D CNN model that uses volumes as inputs and 3D convolutions (Labonte et al 31), together with an associated uncertainty metric 50, can be further investigated. There are also some emerging automatic techniques 51,52 for searching optimal CNN architectures that could potentially be deployed in our current cases. A future direction might be to train a versatile network with a larger dataset for a specific collection of materials.…”
Section: Discussion
confidence: 99%
“…The combination of all of these potential techniques opens up a broad search-space of neural architecture designs for NLP. This motivates the application of automated neural architecture search (NAS) approaches such as those described in (Shaw et al, 2019;Wu et al, 2019a) to further improve the design of neural networks for NLP.…”
Section: Discussion
confidence: 99%
“…It offers potential for automated architecture search, especially as the demand for customized networks rises. FBNetV3 [25], SqueezeNAS [26] and ProxylessNAS [15] focus on low-latency network architecture search for mobile devices. They were developed to replace the costly redesign of DNNs for specific tasks on specific platforms.…”
Section: Related Work
confidence: 99%