On the basis of recent research, brain-inspired parallel computing
is considered one of the most promising technologies for efficiently
handling large amounts of data. In general, this type
of parallel computing is called neuromorphic computing; it operates
on hardware-neural-network (HW-NN) platforms consisting
of numerous artificial synapses and neurons. Extensive research has
been conducted to implement artificial synapses with characteristics
required to ensure high-level performance of HW-NNs in terms of device
density, energy efficiency, and learning accuracy. Recently, artificial
synapses, specifically diode- and transistor-type synapses, based
on various two-dimensional (2D) van der Waals (vdW) materials have
been developed. Unique properties of such 2D vdW materials allow for
notable improvements in synaptic performance in terms of learning
capability, scalability, and power efficiency, thereby highlighting
the potential of 2D vdW synapses to improve the performance
of HW-NNs. In this review, we introduce the desirable characteristics
of artificial synapses required to ensure high-level performance of
neural networks. Recent progress in research on artificial synapses,
particularly those fabricated using 2D vdW materials and heterostructures,
is comprehensively discussed with respect to the weight-update mechanism,
synaptic characteristics, power efficiency, and scalability.