2020 IEEE Asian Solid-State Circuits Conference (A-SSCC)
DOI: 10.1109/a-sscc48613.2020.9336139
Always-On, Sub-300-nW, Event-Driven Spiking Neural Network based on Spike-Driven Clock-Generation and Clock- and Power-Gating for an Ultra-Low-Power Intelligent Device

Abstract: Always-on artificial intelligence (AI) functions such as keyword spotting (KWS) and visual wake-up tend to dominate total power consumption in ultra-low-power devices [1]. A key observation is that the signals to an always-on function are sparse in time, which a spiking neural network (SNN) classifier can leverage for power savings, because the switching activity and power consumption of SNNs tend to scale with spike rate. Toward this goal, we present a novel SNN classifier architecture for always-on functions,…
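The abstract's central point is that an event-driven SNN only does work when an input spike arrives, so its switching activity follows the input spike rate rather than a fixed clock. The paper realizes this in hardware through spike-driven clock generation with clock- and power-gating; the snippet below is only a minimal software sketch of the same event-driven idea, assuming a simple leaky integrate-and-fire (LIF) layer with hypothetical parameters, not the authors' circuit.

```python
import numpy as np

# Minimal, hypothetical sketch of the event-driven idea from the abstract:
# a leaky integrate-and-fire (LIF) layer whose state is touched only when
# an input spike arrives, so compute scales with spike rate. This is a
# software illustration, not the paper's spike-driven clocking and
# clock-/power-gating circuit; all parameters are assumptions.
class EventDrivenLIFLayer:
    def __init__(self, n_in, n_out, tau=20e-3, v_th=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(0.0, 0.3, size=(n_in, n_out))  # synaptic weights
        self.v = np.zeros(n_out)   # membrane potentials
        self.t_last = 0.0          # time of the previous input event
        self.tau, self.v_th = tau, v_th

    def on_spike(self, t, pre_idx):
        """Handle one input spike at time t from presynaptic neuron pre_idx."""
        # Apply the leak for the whole idle interval in a single step;
        # nothing was computed while no events arrived (the gated period).
        self.v *= np.exp(-(t - self.t_last) / self.tau)
        self.t_last = t
        self.v += self.w[pre_idx]                  # integrate the event
        fired = np.flatnonzero(self.v >= self.v_th)
        self.v[fired] = 0.0                        # reset neurons that spiked
        return fired                               # indices of output spikes

# Feed a sparse event stream (time, input index); work done is proportional
# to the number of events, not to elapsed wall-clock time.
layer = EventDrivenLIFLayer(n_in=16, n_out=4)
for t, idx in [(0.001, 3), (0.004, 7), (0.050, 3)]:
    fired = layer.on_spike(t, idx)
    if fired.size:
        print(f"t={t*1e3:.1f} ms: output spikes {fired.tolist()}")
```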

Cited by 19 publications (2 citation statements). References 15 publications.
“…One of the key advantages of SNNs lies in their event‐driven processing nature, which results in reduced power consumption and improved efficiency compared to traditional ANNs that rely on computations of continuous values. [20] Additionally, SNNs have shown promise in implementing various neural information processing tasks, such as pattern recognition, associative memory, and temporal processing.…”
Section: Results (confidence: 99%)
“…We compare our proposed approach with other state-of-the-art SNN macros [9]–[11] and digital CIM macros [12]–[14] (Table I). Among the SNN macros, [9] has poor energy efficiency due to time-based digital oscillator circuits for implementing neuron functionality, while [11] has 2.7× lower energy efficiency (assuming linear scaling with bit-precision) compared to our design due to very low-frequency operation. Ref.…”
Section: Implementation Results (confidence: 99%)
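The 2.7× figure quoted above rests on normalizing each macro's reported energy per synaptic operation linearly by its weight bit-precision before comparing. As a hedged illustration of that normalization step only (the energy and precision values below are hypothetical placeholders, not figures from the cited works):

```python
# Hypothetical illustration of bit-precision-linear normalization for
# comparing energy per synaptic operation (SOP) across macros.
def normalized_energy_per_sop(energy_pj, bits, ref_bits=8):
    """Linearly rescale energy/SOP to a common reference bit-precision."""
    return energy_pj * (ref_bits / bits)

# Placeholder numbers only: a 4-bit macro at 0.8 pJ/SOP vs. an 8-bit macro
# at 3.0 pJ/SOP become 1.6 pJ and 3.0 pJ at the 8-bit-equivalent point.
a = normalized_energy_per_sop(0.8, bits=4)
b = normalized_energy_per_sop(3.0, bits=8)
print(f"macro B uses {b / a:.2f}x the normalized energy of macro A")
```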