Neuromorphic computing, a computing paradigm inspired by the
human brain, has attracted increasing attention as a promising approach
to developing artificial intelligence systems. This study
proposes synaptic transistors with a Li₁₋ₓAlₓTi₂₋ₓ(PO₄)₃ (LATP) layer to analyze the conductance
modulation linearity, which is essential for weight mapping and updating
during on-chip learning processes. The high ionic conductivity of
the LATP electrolyte provides a large hysteresis window and enables
linear weight updates in synaptic devices. The results demonstrate
that optimizing the LATP layer thickness improves the conductance
modulation and linearity of synaptic transistors during potentiation
and depression. A 20 nm-thick LATP layer yields the most nonlinear
depression (αd = −6.59), whereas a 100 nm-thick
LATP layer yields the smallest depression nonlinearity (αd = −2.22). Additionally, the device with the optimal 100 nm-thick
LATP layer exhibits the highest average recognition accuracy of 94.8%
and the smallest accuracy fluctuation, indicating that the linearity characteristics
of a device play a crucial role in weight updating during learning and
can significantly affect the recognition accuracy.
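
The abstract does not state the analytic form behind the reported α values. The sketch below is a minimal Python illustration assuming the widely used exponential behavioral model for analog conductance updates (a common choice in synaptic-device benchmarking); it only shows how a more negative αd produces a more abrupt, less linear depression curve. The conductance window, pulse count, and function names are illustrative assumptions, not values from this work.

import numpy as np

def potentiation(n, n_max, g_min, g_max, alpha_p):
    # Conductance after n identical potentiation pulses (assumed exponential model).
    # alpha_p -> 0 recovers a perfectly linear update; a larger |alpha_p|
    # makes the curve saturate earlier, i.e. less linear.
    if abs(alpha_p) < 1e-9:
        return g_min + (g_max - g_min) * n / n_max
    frac = (1 - np.exp(-alpha_p * n / n_max)) / (1 - np.exp(-alpha_p))
    return g_min + (g_max - g_min) * frac

def depression(n, n_max, g_min, g_max, alpha_d):
    # Conductance after n identical depression pulses. alpha_d is reported as
    # negative; a more negative value means a more abrupt, nonlinear decay.
    if abs(alpha_d) < 1e-9:
        return g_max - (g_max - g_min) * n / n_max
    frac = (1 - np.exp(alpha_d * n / n_max)) / (1 - np.exp(alpha_d))
    return g_max - (g_max - g_min) * frac

# Compare the two depression nonlinearities quoted above
# (conductance window and pulse number are assumed for illustration).
pulses = np.arange(0, 51)            # 50 depression pulses
g_min, g_max = 10e-9, 100e-9         # assumed 10-100 nS window
for alpha_d in (-6.59, -2.22):
    g = depression(pulses, pulses[-1], g_min, g_max, alpha_d)
    print(f"alpha_d = {alpha_d:+.2f}: G after 25 of 50 pulses = "
          f"{g[25] * 1e9:5.1f} nS (an ideal linear update would give 55.0 nS)")

Under these assumed parameters, αd = −6.59 collapses most of the conductance window within the first few pulses, whereas αd = −2.22 keeps the per-pulse decrements closer to uniform, which is consistent with the more faithful weight updates and higher recognition accuracy reported for the 100 nm-thick device.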