2022
DOI: 10.1038/s41586-021-04223-6

Deep physical neural networks trained with backpropagation

Abstract: Deep-learning models have become pervasive tools in science and engineering. However, their energy requirements now increasingly limit their scalability [1]. Deep-learning accelerators [2–9] aim to perform deep learning energy-efficiently, usually targeting the inference phase and often by exploiting physical substrates beyond conventional electronics. Approaches so far [10–22] have been unable to apply the backpropagation algorithm to train unconventional novel hardware in situ. The advantages of backpropagation have …
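As a rough illustration of the physics-aware training idea the abstract refers to (forward pass executed by the physical system, backward pass estimated through a differentiable digital model of that system), here is a minimal PyTorch sketch. The `physical_system` and `digital_model` functions are hypothetical stand-ins introduced for illustration only; this is not the paper's apparatus or released code.

```python
# Minimal sketch of physics-aware training: the forward pass comes from a
# (here, simulated) physical system, while gradients are backpropagated
# through a differentiable digital model of that system.
import torch

def physical_system(x, theta):
    # Hypothetical stand-in for a physical transformation; in practice this
    # would be a measurement of real hardware, including its imperfections.
    return torch.tanh(x @ theta) + 0.01 * torch.randn_like(x @ theta)

def digital_model(x, theta):
    # Differentiable digital twin, used only to estimate gradients.
    return torch.tanh(x @ theta)

class PhysicsAwareLayer(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, theta):
        ctx.save_for_backward(x, theta)
        with torch.no_grad():
            return physical_system(x, theta)  # output taken from "hardware"

    @staticmethod
    def backward(ctx, grad_output):
        x, theta = ctx.saved_tensors
        with torch.enable_grad():
            x_d = x.detach().requires_grad_(True)
            theta_d = theta.detach().requires_grad_(True)
            y = digital_model(x_d, theta_d)  # backward pass via the model
            grad_x, grad_theta = torch.autograd.grad(y, (x_d, theta_d), grad_output)
        return grad_x, grad_theta

# Usage: the physical layer's parameters are trained with ordinary SGD.
theta = torch.randn(4, 3, requires_grad=True)
opt = torch.optim.SGD([theta], lr=0.1)
x, target = torch.randn(8, 4), torch.randn(8, 3)
for _ in range(100):
    opt.zero_grad()
    y = PhysicsAwareLayer.apply(x, theta)
    loss = ((y - target) ** 2).mean()
    loss.backward()
    opt.step()
```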

Cited by 396 publications (192 citation statements)
References 56 publications
“…We demonstrate a few examples of utilizing the developed optical accelerator in the applications of image classification and materials discovery. The prediction accuracy in these applications is improved through physics-aware training process [28]. Furthermore, we show that the developed O-GEMM hardware accelerator can be further employed in the RL algorithms to accelerate the chip design of another optical accelerator.…”
Section: Results (citation type: mentioning)
Confidence: 99%
“…Although the material and device parameter space in our demonstration is not gigantic, the developed highly-parallelized hardware emulator enables the further exploration of various RL algorithms in large-scale optimization problems. Moreover, the demonstrated physics-aware training approaches lay out strategies on how to deploy physical hardware systems to different ML application scenarios [28].…”
Section: Discussion (citation type: mentioning)
Confidence: 99%
“…Indeed, there have been a myriad of scholarly attempts to construct learning algorithms within purely physical systems [21–26]. Here, we will focus on "equilibrium propagation" [24,27] and "coupled learning" [25].…”
Section: Introduction (citation type: mentioning)
Confidence: 99%
“…The inevitable loss of phase information during these conversions leads to a substantial measurement deviation from the computer-developed model. Additional adjustment and iterative training for error corrections are needed [15,22], and the reconfigurability and transfer to other tasks are still laborious to accomplish. Multiple conversions between electrical and optical domains also weaken the advantages of employing optical approaches over electrical ones.…”
Section: Introduction (citation type: mentioning)
Confidence: 99%