2023
DOI: 10.1126/science.adi8474

Backpropagation-free training of deep physical neural networks

Ali Momeni,
Babak Rahmani,
Matthieu Malléjac
et al.

Abstract: Recent successes in deep learning for vision and natural language processing are attributed to larger models, but these come with energy-consumption and scalability issues. Current training of digital deep-learning models relies primarily on backpropagation, which is unsuitable for physical implementation. In this work, we propose a simple deep neural network architecture augmented by a physical local learning (PhyLL) algorithm, which enables supervised and unsupervised training of deep physical neural networks without…
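The abstract describes replacing backpropagation with purely local, per-layer learning rules. A minimal sketch of what layer-local, backpropagation-free training can look like is below, written in the forward-forward style (positive vs. negative passes with a per-layer "goodness" objective). This is an illustrative assumption, not the paper's PhyLL algorithm: the squared-activity objective, layer sizes, and learning rate are all invented for the example.

```python
import numpy as np

# Illustrative sketch of backpropagation-free, layer-local training in the
# spirit of forward-forward-style objectives. NOTE: this is an assumption
# for illustration, not the paper's PhyLL algorithm; the squared-activity
# "goodness" objective, sizes, and learning rate are invented here.

rng = np.random.default_rng(0)

def layer_forward(W, x):
    """One layer: linear map followed by ReLU."""
    return np.maximum(0.0, x @ W)

def local_update(W, x_pos, x_neg, lr=0.01):
    """Update a single layer with a purely local objective: raise the mean
    squared activity ("goodness") on positive data and lower it on negative
    data. No error signal is propagated between layers."""
    h_pos = layer_forward(W, x_pos)
    h_neg = layer_forward(W, x_neg)
    # d mean(h^2) / dW for a ReLU layer: x^T (2h) / N  (h is zero where inactive)
    g_pos = x_pos.T @ (2.0 * h_pos) / len(x_pos)
    g_neg = x_neg.T @ (2.0 * h_neg) / len(x_neg)
    return W + lr * (g_pos - g_neg)  # ascend positive goodness, descend negative

d_in, d_hid = 8, 16
W1 = rng.normal(scale=0.1, size=(d_in, d_hid))
x_pos = rng.normal(loc=+1.0, size=(32, d_in))  # "positive" samples
x_neg = rng.normal(loc=-1.0, size=(32, d_in))  # "negative" samples

before = np.mean(layer_forward(W1, x_pos) ** 2)
for _ in range(50):
    W1 = local_update(W1, x_pos, x_neg)
after = np.mean(layer_forward(W1, x_pos) ** 2)
print(after > before)
```

Because each layer is trained with only its own inputs and outputs, a stack of such layers can be trained in sequence without ever differentiating through the layer's internals, which is what makes this family of schemes attractive for physical systems that cannot be accurately modeled or backpropagated through.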

Cited by 22 publications (1 citation statement)
References 70 publications (83 reference statements)
“…We also test the negative-log-likelihood (NLL) of the gener… One of the possible extensions of our work is to train the network physically [49,50]. This becomes critical when an accurate digital modeling of the physical system becomes challenging due to its complexity.…”
mentioning
confidence: 99%