2020
DOI: 10.1109/tcsi.2019.2937227

Processing Near Sensor Architecture in Mixed-Signal Domain With CMOS Image Sensor of Convolutional-Kernel-Readout Method


Cited by 42 publications (30 citation statements)
References 25 publications
“…However, traditional line-by-line scanning scheme is not suitable for real-time 2-D convolution, which is the basis of most image processing algorithms. In our work, kernel-data readout methods are applied, which can output 2-D kernel-sized data in a single clock cycle to support instant 2-D convolution operations without buffers [18].…”
Section: TFT Near-Sensor CIM System (mentioning)
confidence: 99%
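The statement above contrasts line-by-line readout, which must buffer several rows before the first convolution output, with a kernel readout that delivers a whole kernel-sized patch per clock cycle. The following is a minimal behavioral sketch of that idea, not the paper's circuit: a 3x3 kernel is assumed, and the function name read_kernel_patch is hypothetical.

```python
import numpy as np

# Behavioral model of kernel-sized readout: each access returns a full
# K x K pixel patch, so one multiply-accumulate step yields one output
# sample without any line buffers. (Illustrative assumption, not the
# cited sensor's actual readout circuit.)

K = 3                                   # assumed 3x3 convolution kernel
image = np.random.rand(8, 8)            # stand-in for the pixel array
kernel = np.random.rand(K, K)           # stand-in for kernel weights

def read_kernel_patch(img, row, col, k=K):
    """Model of one kernel-readout cycle: a K x K patch per access."""
    return img[row:row + k, col:col + k]

# With line-by-line scanning, K-1 full rows would have to be buffered
# before the first output; with kernel readout each output is computed
# immediately from the patch returned in that cycle.
out = np.empty((image.shape[0] - K + 1, image.shape[1] - K + 1))
for r in range(out.shape[0]):
    for c in range(out.shape[1]):
        patch = read_kernel_patch(image, r, c)
        out[r, c] = np.sum(patch * kernel)   # one MAC per readout cycle
```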
“…D EEP neural networks (DNNs) have been powering a broad range of applications [3], [4], [5], [6], [7], including natural language processing, image classification, and object recognition. The demand to achieve high accuracy for increasingly computational tasks leads to ever-growing model sizes of modern DNNs [8], [9].…”
Section: Introduction (mentioning)
confidence: 99%
“…The absence of data format converters allows us to reduce the hardware cost. A frequently used option is near-sensor calculation in the analog form [2,3]. Another option is pulse form calculations, since the conversion of analog signals into pulse time parameters is not complicated.…”
Section: Introduction (mentioning)
confidence: 99%
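The statement above mentions converting analog signals into pulse time parameters for pulse-form near-sensor computation. One common realization of that idea is pulse-width encoding; the sketch below illustrates it under assumed values (full-scale voltage, maximum pulse width, counter clock), none of which come from the cited works.

```python
import numpy as np

# Illustrative pulse-width encoding: a larger analog level produces a
# longer pulse, which a digital counter can then measure. All constants
# here are assumptions chosen for clarity.

V_FS = 1.0          # assumed full-scale analog voltage (1 V)
T_MAX = 1e-6        # assumed maximum pulse width (1 us)
F_CLK = 100e6       # assumed counter clock (100 MHz)

def analog_to_pulse_width(v):
    """Map an analog voltage to a pulse width, clipped to full scale."""
    v = min(max(v, 0.0), V_FS)
    return (v / V_FS) * T_MAX

def pulse_width_to_count(t_pulse):
    """Digitize the pulse width by counting clock cycles while it is high."""
    return int(round(t_pulse * F_CLK))

samples = np.array([0.2, 0.5, 0.9])                 # example pixel voltages
counts = [pulse_width_to_count(analog_to_pulse_width(v)) for v in samples]
print(counts)   # [20, 50, 90] clock counts for a 1 us full-scale pulse
```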