Proceedings of the 52nd Annual IEEE/ACM International Symposium on Microarchitecture 2019
DOI: 10.1145/3352460.3358268
FlexLearn

Cited by 19 publications (5 citation statements)
References 60 publications
“…
Chip | Workloads | Computing Paradigm | Implementation, Scale
…Original | SNNs | Near-Memory Computing | Analog-Digital-Mixed, Large-Scale
BrainScaleS [87] | SNNs, Learning | Near-Memory Computing | Analog-Digital-Mixed, Large-Scale
SpiNNaker [81] | SNNs, Learning | Near-Memory Computing | Digital, Large-Scale
TrueNorth [14] | SNNs | Near-Memory Computing | Digital, Large-Scale
Darwin [83] | SNNs | Near-Memory Computing | Digital, Small-Scale
ROLLS [88] | SNNs, Learning | Near-Memory Computing | Analog-Digital-Mixed, Small-Scale
DYNAPs [94] | SNNs | Near-Memory Computing | Analog-Digital-Mixed, Small-Scale
Loihi [41] | SNNs, Learning | Near-Memory Computing | Digital, Large-Scale
Tianjic [85], [86] | ANNs & SNNs | Near-Memory Computing | Digital, Large-Scale
ODIN [89] | SNNs, Learning | Near-Memory Computing | Digital, Small-Scale
MorphIC [90] | SNNs, Learning | Near-Memory Computing | Digital, Small-Scale
DYNAPs-CNN/DYNAP-SE [44] | SNNs | Near-Memory Computing | Digital, Small-Scale
FlexLearn [99] | SNNs, Learning | ANN Accelerator Variants | Digital, Large-Scale
SpinalFlow [100] | SNNs | ANN Accelerator Variants | Digital, Small-Scale
H2Learn [101] | SNNs, Learning | ANN Accelerator Variants | Digital, Large-Scale
SATA [102] | SNNs, Learning | ANN Accelerator Variants | Digital, Small-Scale
BrainScaleS 2 [82] | ANNs & SNNs, Learning | Near-Memory Computing | Digital, Large-Scale
SpiNNaker 2 [95] | ANNs & SNNs, Learning | Near-Memory Computing | Digital, Large-Scale
Y. Kuang et al. [282] | ANNs & SNNs | Near-Memory Computing | Digital, Large-Scale
SRAM/DRAM/Flash-based | ANNs, SNNs | In-Memory Computing | Digital, Small-Scale
Memristor-based | ANNs, SNNs | In-Memory Computing | Analog-Digital-Mixed, Small-Scale
…neuromorphic workloads. The learning of SNNs on GPUs is inefficient and hard to optimize [91].…”
Section: Chip Family (mentioning)
confidence: 99%
“…To address this challenge, some works design BIC chips that support on-chip learning rules. For example, ROLLS, ODIN, and MorphIC support spike-driven synaptic plasticity (SDSP) rules [88]–[90]; Loihi adds learning modules for STDP rules [41]; and FlexLearn further extends support to a broader scope of synaptic plasticity rules [99]. In SpiNNaker and BrainScaleS, STDP learning has been demonstrated through time-stamp recording and learning circuits [87], [92].…”
Section: Chip Family (mentioning)
confidence: 99%
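As a concrete illustration of the STDP rules these chips target, the sketch below computes the weight update for a single pre/post spike pair under the classic pair-based STDP formulation. It is a minimal textbook sketch, not the learning datapath of Loihi or FlexLearn; the constants A_PLUS, A_MINUS, and the time constants are illustrative assumptions.

```python
import numpy as np

# Pair-based STDP: potentiate when a presynaptic spike precedes a
# postsynaptic spike (LTP), depress in the opposite order (LTD).
# All parameter values below are illustrative, not taken from any chip.
A_PLUS, A_MINUS = 0.01, 0.012    # learning rates for LTP / LTD
TAU_PLUS = TAU_MINUS = 20.0      # exponential windows (ms)

def stdp_dw(t_pre: float, t_post: float) -> float:
    """Weight change for one pre/post spike pair (spike times in ms)."""
    dt = t_post - t_pre
    if dt > 0:    # pre before post -> potentiation
        return A_PLUS * np.exp(-dt / TAU_PLUS)
    return -A_MINUS * np.exp(dt / TAU_MINUS)  # post before pre -> depression

# A pre-spike at 10 ms followed by a post-spike at 15 ms:
print(stdp_dw(10.0, 15.0))   # ~ +0.0078, a small potentiation
```

Hardware implementations typically avoid storing raw spike times and instead approximate this rule with decaying traces, which keeps the update cheap to evaluate at each spike event.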
“…Loihi [5] achieved enhanced learning capabilities through configurable sets of traces and delays. FlexLearn [6] designed a versatile datapath that combines key features from diverse learning models. Moreover, efforts have been made to develop fully configurable neuron models using instructions.…”
Section: Introduction (mentioning)
confidence: 99%
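To make "configurable sets of traces" concrete, the following sketch implements trace-based online STDP: each spike bumps an exponentially decaying trace, and weight updates are computed from the opposite trace's value at spike time. The class, constants, and update ordering are assumptions for illustration, not Loihi's or FlexLearn's actual datapath.

```python
import numpy as np

class SpikeTrace:
    """Exponentially decaying trace: +1 on a spike, multiplicative decay otherwise."""
    def __init__(self, tau: float = 20.0, dt: float = 1.0):
        self.decay = float(np.exp(-dt / tau))  # per-step decay factor
        self.x = 0.0

    def step(self, spiked: bool) -> float:
        # Decay first, then add this step's spike contribution.
        self.x = self.x * self.decay + (1.0 if spiked else 0.0)
        return self.x

pre_trace, post_trace = SpikeTrace(), SpikeTrace()
w, A_PLUS, A_MINUS = 0.5, 0.01, 0.012

# Per step: a post spike potentiates by the pre trace,
# a pre spike depresses by the post trace.
for pre, post in [(1, 0), (0, 1), (0, 0), (1, 0)]:  # (pre, post) spikes per step
    x_pre = pre_trace.step(bool(pre))
    x_post = post_trace.step(bool(post))
    w += A_PLUS * x_pre * post - A_MINUS * x_post * pre

print(f"final weight: {w:.4f}")
```

A configurable learning engine in this spirit would expose the number of traces, their time constants, and the form of the update as parameters rather than hard-wiring a single rule.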
“…However, prior works either cannot support, or only partially support, biologically plausible neuron models and synaptic learning rules. To address these challenges, Lee et al. (2018) and Baek et al. (2019) presented programmable SNN hardware that supports a wide range of neuron models and synaptic learning rules. Another approach is to use an FPGA platform, which allows flexible modification of neuron models and network structures by reconfiguring the hardware architecture (Cheung et al., 2016; Sripad et al., 2018).…”
mentioning
confidence: 99%
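For context on what a programmable "neuron model" amounts to, the sketch below Euler-integrates a leaky integrate-and-fire (LIF) neuron, one of the simplest models such hardware exposes; richer models (e.g., adaptive thresholds) add state variables to the same loop. Parameter names and values are illustrative assumptions.

```python
import numpy as np

def simulate_lif(input_current, v_rest=0.0, v_th=1.0, tau_m=10.0, dt=1.0):
    """Euler-integrated leaky integrate-and-fire; returns spike step indices."""
    v, spikes = v_rest, []
    for t, i_in in enumerate(input_current):
        v += (dt / tau_m) * (-(v - v_rest) + i_in)  # leak toward rest + input drive
        if v >= v_th:          # threshold crossing: emit a spike
            spikes.append(t)
            v = v_rest         # hard reset to the resting potential
    return spikes

# Constant suprathreshold drive produces regular spiking.
print(simulate_lif(np.full(50, 1.5)))  # [10, 21, 32, 43]
```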