2021
DOI: 10.1038/s41598-021-82239-8
Deep-learning-based high-resolution recognition of fractional-spatial-mode-encoded data for free-space optical communications

Abstract: Structured light with spatial degrees of freedom (DoF) is considered a potential solution to address the unprecedented demand for data traffic, but there is a limit to effectively improving the communication capacity by its integer quantization. We propose a data transmission system using fractional mode encoding and deep-learning decoding. Spatial modes of Bessel-Gaussian beams separated by fractional intervals are employed to represent 8-bit symbols. Data encoded by switching phase holograms is efficiently d…

Cited by 30 publications
(14 citation statements)
References 53 publications
“…As shown in Fig. 2 , fractional TCs give rise to local variations in the spatial field distribution, allowing a deep-learning model to discriminate them effectively despite the small mode interval 18 . Here, we assume 5 types of 10-OAM free-space optical links, where each of them has different mode spacing among .…”
Section: Methods
confidence: 99%
“…To utilize BN effectively, we adopted the so-called pre-activation structure of BN-ReLU-Conv presented in modern network architectures such as residual network (ResNet) 40 and densely connected network (DenseNet) 41 . The ReLU is a nonlinear activation function of the form f(x) = max(0, x), whose purpose is to provide non-linearity to the output 18 . The ReLU is suitable for deep neural networks possessing many hidden layers because it can mitigate the vanishing-gradient problem 39 .…”
Section: Methods
confidence: 99%
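The ReLU described in the statement above can be sketched as a minimal NumPy function. This is an illustrative helper, not the cited paper's implementation, and the framework-specific BN-ReLU-Conv block it sits inside is omitted:

```python
import numpy as np

def relu(x):
    """Rectified linear unit: f(x) = max(0, x), applied element-wise.

    For x > 0 the derivative is 1, which is why deep stacks of ReLU
    layers are less prone to vanishing gradients than saturating
    activations such as sigmoid or tanh.
    """
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # negative inputs are clipped to zero; positives pass through
```

In a pre-activation residual block the same function would be applied after batch normalization and before the convolution, rather than after it.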