2020 57th ACM/IEEE Design Automation Conference (DAC)
DOI: 10.1109/dac18072.2020.9218593

FLOPS: EFficient On-Chip Learning for OPtical Neural Networks Through Stochastic Zeroth-Order Optimization

Abstract: Optical neural networks (ONNs) have demonstrated record-breaking potential in high-performance neuromorphic computing due to their ultra-high execution speed and low energy consumption. However, current learning protocols fail to provide scalable and efficient solutions to photonic circuit optimization in practical applications. In this work, we propose a novel on-chip learning framework to release the full potential of ONNs for power-efficient in situ training. Instead of deploying implementation-costly back-propagation…
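
The abstract's key idea, training photonic circuits in situ without implementation-costly back-propagation, rests on stochastic zeroth-order optimization, which estimates gradients purely from loss measurements. Below is a minimal sketch of that general technique using a one-sample SPSA-style estimator; it is not the paper's exact algorithm, and the loss_fn interface, perturbation scale, and toy quadratic objective are illustrative assumptions.

```python
import numpy as np

def zo_gradient(loss_fn, phases, eps=1e-2):
    """One-sample SPSA-style zeroth-order gradient estimate.

    loss_fn -- black-box objective; on an ONN this would be a loss
               measured on-chip for a given vector of MZI phase
               settings (hypothetical interface, not the paper's API).
    phases  -- current phase configuration, shape (n,).
    eps     -- perturbation magnitude.
    """
    delta = np.random.choice([-1.0, 1.0], size=phases.shape)  # Rademacher directions
    loss_plus = loss_fn(phases + eps * delta)
    loss_minus = loss_fn(phases - eps * delta)
    # Symmetric finite difference along delta; 1/delta_i == delta_i for +/-1 entries.
    return (loss_plus - loss_minus) / (2.0 * eps) * delta

def train_onn(loss_fn, phases, lr=0.05, steps=2000):
    """Plain gradient descent on zeroth-order gradient estimates."""
    for _ in range(steps):
        phases = phases - lr * zo_gradient(loss_fn, phases)
    return phases

if __name__ == "__main__":
    # Toy stand-in for an on-chip loss measurement: distance to target phases.
    target = np.array([0.3, -1.2, 0.7, 2.1])
    toy_loss = lambda p: float(np.sum((p - target) ** 2))

    trained = train_onn(toy_loss, np.zeros(4))
    print("final loss:", toy_loss(trained))  # should approach 0
```

Note that each step needs only two loss evaluations regardless of parameter count, which is what makes a measurement-driven, back-propagation-free protocol attractive on photonic hardware.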

Cited by 16 publications (23 citation statements). References 33 publications.
“…A detailed introduction to ONNs can be found in Appendix A. Beyond offline training [21], ONN on-chip training methods are proposed to offload the process back onto photonics [24,20,17], shown in Table 1. Brute-force device tuning (BFT) [41,58] and evolutionary algorithms [56] are applied to search MZI settings.…”
Section: Related Work (mentioning)
confidence: 99%
“…An adjoint variable method (AVM) [24] is proposed to directly evaluate gradients using in-situ light field monitoring. Stochastic zeroth-order optimization (ZOO) [20,17] is later applied to improve the training efficiency. However, prior methods are hard to scale to larger ONNs either due to algorithmic inefficiency or unrealistic hardware complexity.…”
Section: Related Work (mentioning)
confidence: 99%