2023
DOI: 10.1109/jssc.2022.3200515
8-b Precision 8-Mb ReRAM Compute-in-Memory Macro Using Direct-Current-Free Time-Domain Readout Scheme for AI Edge Devices

Cited by 26 publications (7 citation statements)
References 36 publications
“…Weight shifting with compensation was implemented in three stages: (i) weight shifting to increase the number of HRS cells in the cell array; (ii) dot-product computing based on the shifted weight data; and (iii) compensation of the dot-product results in accordance with the weight-shifting value. When storing the weight data of a neural network into the memristor-CIM, each 8-bit weight (W[7:0]) is shifted by adding a positive bias (B_Shift; e.g., 16) via an on-chip shifting circuit. Note that in typical neural network models, small positive and negative weight values account for the largest proportion of the weights.…”
Section: Weight Shifting With Compensation In Memristor-CIM
confidence: 99%
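The three-stage scheme quoted above can be sketched as plain arithmetic. This is a minimal behavioral model only, not the on-chip circuit; the function name is illustrative, and the default bias of 16 follows the example given in the quote.

```python
import numpy as np

def shifted_dot_product(inputs, weights, b_shift=16):
    # Stage (i): shift each signed weight by a positive bias, so the
    # stored codes skew toward values that map to more HRS cells.
    shifted_w = weights.astype(np.int32) + b_shift
    # Stage (ii): dot product computed on the shifted weight data.
    raw = int(np.dot(inputs.astype(np.int32), shifted_w))
    # Stage (iii): compensate by subtracting b_shift * sum(inputs),
    # which recovers the true (unshifted) dot product.
    return raw - b_shift * int(inputs.astype(np.int32).sum())

w = np.array([-3, 1, 0, 5], dtype=np.int8)   # example signed 8-b weights
x = np.array([2, 4, 1, 3], dtype=np.int8)    # example activations
assert shifted_dot_product(x, w) == int(
    np.dot(x.astype(np.int32), w.astype(np.int32)))  # 13 either way
```

The compensation term works because the shift adds `b_shift` to every weight, contributing exactly `b_shift * sum(inputs)` to the raw result, independent of the weight values themselves.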
“…(1) The sampling structure, consisting of four sampling switches (SW3-SW6) and two sampling capacitors (C0, C1), samples the voltages V_REFL, V_REFH, and V_SUM (V_REFL = 1/4 VDD, V_REFH = 3/4 VDD; V_SUM is the input voltage to be quantized); (2) the LSB sensing stage includes three MOS transistors N0 (N1), P0 (P1), and P2 (P3), a switch SW1 (SW2), and an inverter, generating the OUT2 (OUT2B) signal for the LSB-detecting circuit; (3) the latch comprises two cascaded inverters formed by N2, P4 and N3, P5, along with two switching MOS transistors (N4, P6), producing SAOUT[1] by comparing the voltages at nodes Q1B and Q1; (4) the LSB-detecting circuit, a 2-to-1 selector, selects between OUT2 and the inverse of OUT2B based on the value of SAOUT[1] to determine the SAOUT[0] result.…”
Section: The Proposed MQL-VSA
confidence: 99%
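The LSB-detecting selector described in stage (4) reduces to a 2-to-1 mux in behavioral terms. A small sketch follows; note the select polarity (which input SAOUT[1] picks) is an assumption, since the excerpt only states that SAOUT[0] comes from either OUT2 or the inverse of OUT2B.

```python
def lsb_select(saout1: int, out2: int, out2b: int) -> int:
    # Behavioral 2-to-1 selector: SAOUT[0] is taken from OUT2 or from
    # the inverse of OUT2B depending on SAOUT[1]. The polarity chosen
    # here (saout1 == 1 selects OUT2) is an assumption for illustration.
    return out2 if saout1 else 1 - out2b

# When the sensing outputs are complementary (out2b == 1 - out2), both
# mux paths agree, so SAOUT[0] tracks OUT2 for either select value.
assert lsb_select(1, out2=1, out2b=0) == 1
assert lsb_select(0, out2=1, out2b=0) == 1
```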
“…Concurrently, SW1 and SW2 are turned off, and N0, P0 and N1, P1 form inverter structures, causing voltage swings at nodes Q2B and Q2 opposite to those at Q1B and Q1, respectively; (3) PH3. This phase produces the outputs SAOUT[0] and SAOUT[1]. As Figure 6c illustrates, with SAEN1 = 1, the voltages at Q2 and Q2B are processed through inverters to generate OUT2 and OUT2B, respectively.…”
Section: Workflow of the MQL-VSA
confidence: 99%