2018
DOI: 10.1109/access.2018.2801813

Fast and Lightweight Object Detection Network: Detection and Recognition on Resource Constrained Devices

Cited by 24 publications (13 citation statements)
References 35 publications
“…For many inference tasks, the full-resolution floating-point arithmetic provided by general-purpose processors and GPUs is not required. Network pruning therefore uses post-training DNN analysis to eliminate connections between neurons that have little or no effect on performance [10]. Deleting these connections removes needless calculations and, as an additional benefit, significantly reduces the energy consumed by memory accesses.…”
Section: Machine Learning in Embedded Systems (mentioning)
confidence: 99%
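The pruning step described in that statement is algorithmic, so a minimal sketch may help. The version below assumes a simple magnitude-based criterion applied after training; the cited work [10] may use a different pruning rule, and the layer shape and sparsity level are illustrative only.

```python
# Minimal sketch of post-training magnitude pruning (assumption: prune the
# smallest-magnitude weights; the cited work may use another criterion).
import numpy as np

def prune_weights(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the fraction `sparsity` of weights with the smallest magnitude."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold             # keep only larger weights
    return weights * mask

# Example: prune 80% of a hypothetical fully connected layer's weights.
rng = np.random.default_rng(0)
w = rng.normal(size=(256, 128))
w_pruned = prune_weights(w, sparsity=0.8)
print("non-zero fraction:", np.count_nonzero(w_pruned) / w_pruned.size)
```

The zeroed connections are what allow an inference engine to skip the corresponding multiply-accumulates and weight fetches, which is where the energy savings mentioned above come from.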
“…One of the state-of-the-art benchmarks for object tracking is the visual tracking challenge [8], and results from recent years have focused on both deep features [11] and deep learning techniques [9]. Algorithms and CNNs are continually developed to become more accurate and faster, but they are typically evaluated on benchmark databases [10] using powerful hardware platforms. On the other side, designers and researchers are looking to diversify the available hardware options so as to provide the resources required to implement such demanding networks efficiently.…”
Section: Introduction (mentioning)
confidence: 99%
“…Oliveira et al. [13] proposed a Fast and Lightweight Object Detection Network (FLODNet) model based on a convolutional neural network, which is faster than deeper CNN models and can execute on a CPU within a few seconds. FLODNet is a shallow convolutional neural network with 10 layers and a fixed kernel size.…”
Section: Related Work (mentioning)
confidence: 99%
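To make the "10 layers, fixed kernel size" description concrete, here is a minimal PyTorch sketch of a shallow network of that shape. The channel widths, pooling placement, and the simple classification head used in place of a detection head are illustrative assumptions, not the published FLODNet configuration.

```python
# Sketch of a shallow CNN with ten convolutional layers that all share one
# kernel size, in the spirit of the FLODNet description above (assumptions:
# channel widths, pooling positions, and the classification head are made up).
import torch
import torch.nn as nn

class ShallowFixedKernelNet(nn.Module):
    def __init__(self, num_classes: int = 20, kernel_size: int = 3):
        super().__init__()
        channels = [3, 16, 32, 64, 64, 128, 128, 128, 128, 128, 128]
        layers = []
        for i in range(10):  # ten conv layers, every one with the same kernel size
            layers += [
                nn.Conv2d(channels[i], channels[i + 1], kernel_size,
                          padding=kernel_size // 2),
                nn.ReLU(inplace=True),
            ]
            if i in (1, 3, 5, 7):  # occasional downsampling keeps CPU cost low
                layers.append(nn.MaxPool2d(2))
        self.features = nn.Sequential(*layers)
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(channels[-1], num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# CPU-only inference on one image-sized tensor.
model = ShallowFixedKernelNet().eval()
with torch.no_grad():
    scores = model(torch.randn(1, 3, 224, 224))
print(scores.shape)  # torch.Size([1, 20])
```

Keeping the network shallow and the kernel size fixed bounds both the parameter count and the per-layer compute, which is what makes CPU-only execution in seconds plausible on resource-constrained devices.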
“…Real-time object detection techniques have been applied to a variety of computer vision areas [1,2], such as object classification and object segmentation. Since detection mainly operates in constrained environments, the input images obtained from those environments can be degraded by camera noise or compression artifacts [3][4][5]. In particular, it is hard to detect objects in low-quality images.…”
Section: Introduction (mentioning)
confidence: 99%