2022
DOI: 10.1109/access.2022.3192515
ENOS: Energy-Aware Network Operator Search in Deep Neural Networks

Abstract: This work proposes a novel Energy-aware Network Operator Search (ENOS) approach to address the energy-accuracy trade-offs of a deep neural network (DNN) accelerator. In recent years, novel hardware-friendly inference operators such as binary-weight, multiplication-free, and deep-shift have been proposed to improve the computational efficiency of a DNN accelerator. However, simplifying DNN operators invariably comes at the cost of lower accuracy, especially on complex processing tasks.
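To make the energy-accuracy trade-off concrete, the sketch below contrasts a standard matrix multiply with a binary-weight version, where each weight is replaced by its sign times a shared scale (XNOR-Net-style mean-absolute scaling). This is a generic numpy illustration of the operator class the abstract mentions, not the ENOS implementation; all names are chosen for the example.

```python
import numpy as np

def binary_weight_matmul(x, W):
    """Matmul with weights quantized to {-alpha, +alpha}.

    alpha is the mean absolute weight, a common scaling choice for
    binary-weight operators; the multiply then needs only sign flips
    and one shared scale instead of full-precision products.
    """
    alpha = np.abs(W).mean()
    W_bin = alpha * np.sign(W)
    return x @ W_bin

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))   # a batch of 4 input vectors
W = rng.standard_normal((8, 3))   # full-precision weights

full_precision = x @ W
binarized = binary_weight_matmul(x, W)
# binarized approximates full_precision while each weight carries 1 bit + a scale
```

The accuracy gap between `full_precision` and `binarized` is exactly the kind of per-layer trade-off a search procedure like ENOS would weigh against the operator's energy cost.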

Cited by 11 publications (4 citation statements)
References 32 publications
“…Finally, a large amount of prior work has been done on building better edge processors [44]–[52]. Our work does not aim to inform the design of new hardware for machine learning but rather to provide insights and an evaluation methodology for developing machine learning models for edge tensor processors.…”
Section: Related Work
confidence: 99%
“…For instance, analog computations simplify processing cells and enable the integration of storage and computations within a single cell in many compute-in-memory designs. This significantly reduces data movement during deep learning computations, which is a critical bottleneck for energy and performance in traditional processors [31], [32], [33], [34].…”
Section: Analog-Domain Frequency Transforms
confidence: 99%
“…This Section proposes an online quality detection model for resistance spot welding (RSW) by combining the CNN-LSTM network and attention mechanism, as shown in Figure 3. The model first uses the CNN-LSTM network to automatically extract local detail features and time-series correlation features of the welding data, and then uses the attention mechanism to focus on the critical main features that are most useful for the output results, improving the model's effectiveness and endowing it with interpretability [33,34]. The proposed model consists of five parts: an input layer, a convolutional neural network (CNN) layer, a long short-term memory (LSTM) layer, an attention mechanism layer and an output layer.…”
Section: Online Quality Detection Model for RSW
confidence: 99%
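The five-part pipeline described in that citing statement (input, CNN, LSTM, attention, output) can be sketched end to end in a few lines of numpy. This is a toy illustration under stated simplifications: the LSTM is stood in for by a plain tanh recurrence, all weights are random, and every name here is hypothetical rather than taken from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def conv1d_valid(x, kernel):
    # CNN stage: extract local detail features via a 1-D valid convolution
    n, m = len(x), len(kernel)
    return np.array([x[i:i + m] @ kernel for i in range(n - m + 1)])

def simple_rnn(seq, Wx, Wh):
    # Recurrent stage (LSTM stand-in): a tanh recurrence over the
    # conv features, capturing time-series correlation
    h = np.zeros(Wh.shape[0])
    states = []
    for x_t in seq:
        h = np.tanh(Wx * x_t + Wh @ h)
        states.append(h)
    return np.stack(states)

def attention_pool(states, v):
    # Attention stage: score each time step, normalize, and pool
    weights = softmax(states @ v)
    context = weights @ states        # weighted sum over time steps
    return context, weights

signal = rng.standard_normal(20)                       # one input channel
features = simple_rnn_input = conv1d_valid(signal, rng.standard_normal(3))
H = 4                                                  # hidden size
states = simple_rnn(features, rng.standard_normal(H),
                    0.1 * rng.standard_normal((H, H)))
context, weights = attention_pool(states, rng.standard_normal(H))
y = context @ rng.standard_normal(H)                   # output layer: scalar score
```

The attention weights are the source of the interpretability the excerpt mentions: inspecting `weights` shows which time steps of the welding signal the model treated as most relevant to its output.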