2023
DOI: 10.1021/acsanm.2c04094

Spin–Orbit Torque-Driven Memristor in L10 FePt Systems with Nanoscale-Thick Layers for Neuromorphic Computing

Abstract: In this study, a memristor driven by spin–orbit torque (SOT) is realized in nanoscale-thick L10 FePt systems with high perpendicular magnetic anisotropy (PMA). Due to domain nucleation and expansion driven by current pulses, multilevel Hall resistance states can be continuously tuned by the current density, and the memristive states are retained by domain-wall pinning effects. The properties of the multilevel resistance states for samples with different structures are associated with the magnit…

Cited by 9 publications (6 citation statements) · References 32 publications
“…The network structure has three layers, as shown in Figure b. This neural network is based on the back propagation (BP) structure, which has strong nonlinear mapping ability and a trainable network structure and has been widely used in the field of artificial intelligence. The resistance state of the SOT synaptic device was used to update the synaptic weights between the hidden layer and the output layer in the network, and the weights were randomly assigned before training.…”
Section: Results (mentioning, confidence: 99%)
“…This neural network is based on the back propagation (BP) structure, which has strong nonlinear mapping ability and a trainable network structure and has been widely used in the field of artificial intelligence. 29 The resistance state of the SOT synaptic device was used to update the synaptic weights between the hidden layer and the output layer in the network, and the weights were randomly assigned before training. Training was carried out using the BP gradient-descent technique, and the synaptic weights were updated by mapping the ideal weights determined theoretically with the resistance state of our apparatus.…”
Section: = ± H (mentioning, confidence: 99%)
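The training scheme quoted above (BP gradient descent on ideal weights, which are then mapped onto the device's available resistance states) can be sketched as follows. This is a minimal illustration, not the cited authors' implementation: the task (XOR), network size (2-4-1), learning rate, and the evenly spaced "device levels" standing in for the multilevel Hall-resistance states are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, normalized conductance levels standing in for the device's
# multilevel Hall-resistance states (assumed values, not measured data).
device_levels = np.linspace(-1.0, 1.0, 150)

def to_device(w):
    """Snap each ideal weight to the nearest realizable device level."""
    idx = np.abs(w[..., None] - device_levels).argmin(axis=-1)
    return device_levels[idx]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy task (XOR) on a 2-4-1 network; weights randomly assigned before training.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
W1 = rng.normal(0.0, 1.0, (2, 4))
W2 = rng.normal(0.0, 1.0, (4, 1))

def forward(W1, W2):
    h = sigmoid(X @ W1)
    # Hidden->output weights pass through the device mapping, as in the
    # quoted scheme where device resistance states store these weights.
    return h, sigmoid(h @ to_device(W2))

_, out0 = forward(W1, W2)
mse_before = float(np.mean((out0 - y) ** 2))

for _ in range(5000):
    h, out = forward(W1, W2)
    # BP gradient descent is performed on the ideal (continuous) weights,
    # which are re-mapped to device states at each forward pass.
    d_out = (out - y) * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)
    W2 -= 0.5 * h.T @ d_out
    W1 -= 0.5 * X.T @ d_h

_, out1 = forward(W1, W2)
mse_after = float(np.mean((out1 - y) ** 2))
print(mse_before, mse_after)
```

The key design point mirrored here is that learning updates are computed on continuous "ideal" weights, while inference always reads the quantized device states, so the 150-level resolution bounds the achievable weight precision.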
“…Hereafter, this scenario is termed ideal. The second scenario was implemented by using an experimentally constructed synapse based on 150 R_Hall states (Figure d–f) for synaptic weight, and fitted sigmoidal functions (Figure b) as the hidden layer activation function. The second scenario is experimental and therefore termed exp.…”
(mentioning, confidence: 99%)
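The "exp" scenario above (synaptic weights drawn from a finite set of 150 R_Hall states, with a fitted sigmoid as the hidden-layer activation) can be sketched as below. The state values and sigmoid parameters here are synthetic placeholders; in the cited work they would come from measurements and curve fits.

```python
import numpy as np

n_states = 150
# Synthetic Hall-resistance readings (arbitrary units) standing in for the
# 150 measured R_Hall states; evenly spaced here purely for illustration.
r_hall = np.linspace(-2.0, 2.0, n_states)

# Normalize the states into a symmetric synaptic-weight range [-1, 1].
weight_levels = r_hall / np.abs(r_hall).max()

def quantize(w):
    """Map an ideal weight to the nearest realizable weight level."""
    return float(weight_levels[np.abs(weight_levels - w).argmin()])

def fitted_sigmoid(x, a=1.0, b=4.0, c=0.0):
    """Parameterized sigmoid a / (1 + exp(-b*(x - c))); in the quoted
    scheme a, b, c are obtained by fitting device response curves
    (the values used here are assumptions)."""
    return a / (1.0 + np.exp(-b * (x - c)))

step = weight_levels[1] - weight_levels[0]
print(quantize(0.337), fitted_sigmoid(0.0))
```

With 150 evenly spaced levels over [-1, 1] the quantization step is about 0.013, so any ideal weight lands within half a step of a realizable state.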
“…This can be attributed to massive parallel processing and high energy efficiency. Several reports have suggested that SOT-based devices could be used as artificial synapses for NC. Yadav et al experimentally constructed LTD/LTP responses for the Pt/Co/SiO2 SOT system and utilized it as an artificial synapse for NC. Kurenkov et al realized an artificial neuron and synapse in an AFM/FM-based SOT device for NC.…”
(mentioning, confidence: 99%)