2021
DOI: 10.1002/adpr.202100048
Artificial Intelligence Accelerators Based on Graphene Optoelectronic Devices

Abstract: Optical and optoelectronic approaches of performing matrix–vector multiplication (MVM) operations have shown the great promise of accelerating machine learning (ML) algorithms with unprecedented performance. The incorporation of nanomaterials into the system can further improve the device and system performance thanks to their extraordinary properties, but the nonuniformity and variation of nanostructures in the macroscopic scale pose severe limitations for large‐scale hardware deployment. Here, a new optoelec…

Cited by 17 publications (15 citation statements)
References: 53 publications
“…One critical step towards building practical ONNs with high overall energy efficiency is to design a full-scale optical matrix-vector multiplier with optical fan-out and fan-in (Supplementary Notes 9 and 10 ), integrated with fast and highly efficient modulators 43 and detectors 44 . While the 2D-block matrix-vector multiplier used in this work is not the architecture most closely matched to incorporating integrated-photonics modules in the short term, it may serve as a viable platform for image-processing tasks involving incoherent light sources, which are common in biomedical imaging and robotics 45 .…”
Section: Discussion
confidence: 99%
“…The generated photocurrent from each photodetector in the array, I_i, is proportional to Σ_j v_j w_ij; see Ref. 23 . In order to implement negative values based on non-negative physical quantities, such as transmittance and photocurrent, each v_i and w_ij is represented as a difference of two positive values 23,24 ; see Methods for more details.…”
Section: Results
confidence: 99%
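The differential encoding described in the statement above — representing each signed value as a difference of two non-negative quantities, since transmittance and photocurrent cannot be negative — can be sketched as follows. This is a minimal illustration, not the cited authors' implementation; the function name and example values are assumptions for the demonstration.

```python
import numpy as np

def signed_mvm_nonnegative(v, W):
    """Compute I = W @ v using only non-negative arrays, mimicking
    how optical transmittance and photocurrent (both >= 0) can encode
    signed inputs and weights as differences of two positive values."""
    # Split each signed quantity into non-negative positive/negative parts:
    # v = v_pos - v_neg, W = W_pos - W_neg
    v_pos, v_neg = np.clip(v, 0, None), np.clip(-v, 0, None)
    W_pos, W_neg = np.clip(W, 0, None), np.clip(-W, 0, None)
    # Four all-non-negative partial products (each realizable as a sum of
    # photocurrents), recombined electronically with one subtraction:
    return (W_pos @ v_pos + W_neg @ v_neg) - (W_pos @ v_neg + W_neg @ v_pos)

# Example: the differential result matches the direct signed MVM.
v = np.array([0.5, -1.0, 2.0])
W = np.array([[1.0, -0.5, 0.0],
              [-2.0, 1.0, 0.5]])
assert np.allclose(signed_mvm_nonnegative(v, W), W @ v)
```

The key point is that all four partial products involve only non-negative operands, so each can be accumulated physically as light intensity or photocurrent; only the final subtraction requires signed arithmetic in the readout electronics.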
“…Clearly, the increase of the reward, and thus the decrease of the O-GEMM calculation error, is synchronized with the increase of T_max and T_diff and the decrease of the total thickness, although the only optimization target in the algorithm is the reward from the O-GEMM calculation accuracy. From a physical point of view, the large T_max and T_diff in the SRA indicate strong robustness against noise from the detector and finite-bit quantization 23 .…”
Section: Results
confidence: 99%
“…Recently, photonic neuromorphic processors have been emerging as high-performance hardware ML accelerators by leveraging fundamentally different particles, photons, to break the electronic bottleneck, thanks to the extreme parallelism afforded by the weak interaction and multiplexing of photons as well as low static power consumption [12][13][14][15][16][17][18] . In particular, three-dimensional (3D) free-space diffractive optical neural networks, which exploit the out-of-plane dimension for light routing and can potentially host millions of compact active devices and computing neurons in deep network architectures, physically perform the multiplication and addition operations central to ML algorithms through spatial light modulation and diffraction, and have demonstrated their capability of performing image classification 13,15 .…”
Section: Introduction
confidence: 99%