2019
DOI: 10.1002/adma.201808032
Synaptic Resistors for Concurrent Inference and Learning with High Energy Efficiency

Abstract: Transistor-based circuits with parallel computing architectures and distributed memories, such as graphics processing units (GPUs) from Nvidia, [9] tensor processing units (TPUs) from Google, [3,10] field-programmable gate arrays (FPGAs) from Intel, [11] and the TrueNorth neuromorphic circuit from IBM, [12] have been developed to improve their energy efficiencies (Figure 1a) to the range of 10^10–10^11 FLOPS W^−1 (floating-point operations per second per watt) by increasing parallelism and reducing global da…

Citations: Cited by 39 publications (59 citation statements)
References: References 32 publications
“…The human brain, consisting of 10^11 neurons and 10^15 synapses, can perform learning, calculation, thinking, and inference on "big data" at an estimated speed of 10^16 floating-point operations per second, which is comparable to the operating speed of the fastest supercomputer, Summit. [1,2] However, the energy consumption of the brain (20 W) is only about one millionth that of Summit (10^7 W). The high efficiency of the brain has inspired scientists worldwide to develop artificial synapses, with efforts mainly focused on inorganic materials.…”
Section: Introduction
confidence: 99%
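The brain-versus-Summit comparison quoted above can be checked with back-of-envelope arithmetic. The sketch below uses only the order-of-magnitude figures from the excerpt (estimates, not measured values) to derive the energy efficiencies in FLOPS per watt and the "one millionth" power ratio:

```python
# Back-of-envelope check of the figures quoted in the excerpt above.
# All numbers are order-of-magnitude estimates taken from the text.

BRAIN_FLOPS = 1e16   # estimated operating speed of the human brain (FLOPS)
BRAIN_POWER = 20.0   # brain power consumption, watts
SUMMIT_POWER = 1e7   # Summit supercomputer power consumption, watts

# Both systems run at a comparable ~1e16 FLOPS, so efficiency differs
# only through power draw.
brain_efficiency = BRAIN_FLOPS / BRAIN_POWER    # -> 5e14 FLOPS/W
summit_efficiency = BRAIN_FLOPS / SUMMIT_POWER  # -> 1e9 FLOPS/W

power_ratio = BRAIN_POWER / SUMMIT_POWER        # -> 2e-6, about one millionth

print(f"brain:  {brain_efficiency:.1e} FLOPS/W")
print(f"Summit: {summit_efficiency:.1e} FLOPS/W")
print(f"brain/Summit power ratio: {power_ratio:.0e}")
```

The ~5×10^14 FLOPS W^−1 figure for the brain is why the 10^10–10^11 FLOPS W^−1 range cited for GPUs, TPUs, and FPGAs in the abstract still trails biology by several orders of magnitude.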
“…Herein, we report a new type of 2D OSC-based synaptic transistor prepared by simple solution epitaxy [52,53] at room temperature. The p-type small molecule 2,7-dioctyl[1]benzothieno[3,2-b][1]benzothiophene (C8-BTBT) was selected as the semiconductor material because of its high crystallinity and good solubility. The advantage of solution epitaxy is that 2D OSC films grow directly on the smooth water surface, which eliminates the effect of the dielectric surface on the morphology of the deposited OSC film.…”
Section: Introduction
confidence: 99%
“…This compatibility was further confirmed by fabricating a 4 × 2 crossbar circuit using 72 synstors and 2 integrate-and-fire neuron circuits. Notably, this system shows a computational efficiency of ≈1.6 × 10^17 FLOPS W^−1, which surpasses state-of-the-art supercomputers, [140] highlighting the potential of SWCNT electronics for next-generation neuromorphic hardware compared to other emerging materials (e.g., MoS2, perovskites, chalcogenides). [137]…”
Section: Neuromorphic Computing
confidence: 99%
“…With continuous attention, neuromorphic computing has witnessed rapid development in many areas, such as image processing [132,133] and speech signal recognition. [26] The human brain consists of around 10^11 neurons and 10^15 synapses, [131] which connect with each other to form complicated neural networks that process messages coming into the brain with extremely high efficiency. The synapse, the point of contact between two neurons, is the basic unit of the neural network.…”
Section: Interface Engineering in Synaptic Applications
confidence: 99%