2023
DOI: 10.1002/inf2.12416

Self‐selective memristor‐enabled in‐memory search for highly efficient data mining

Abstract: Similarity search, that is, finding similar items in massive data, is a fundamental computing problem in many fields such as data mining and information retrieval. However, for large‐scale and high‐dimensional data, it suffers from high computational complexity, requiring tremendous computation resources. Here, based on low‐power self‐selective memristors, for the first time, we propose an in‐memory search (IMS) system with two innovative designs. First, by exploiting the natural distribution law of the devi…

Cited by 12 publications (5 citation statements)
References: 35 publications

“…The randomness of the memristive conductance has been considered a non-ideal property for representing the weights of neural networks that are trained off-line in full precision. Instead of being mitigated or compensated, the D2D variation in the MCA provides a natural source of randomness for implementing the random weight matrix of the reservoir layer, which has also been used for security primitives [34-36], locality-sensitive hashing [24,29] and echo state graph neural networks [25].…”
Section: Results (mentioning)
confidence: 99%
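
The idea referenced above, using as-fabricated conductance variation as the random projection weights for locality-sensitive hashing, can be illustrated with a minimal software sketch. Everything below is an assumption-laden toy model rather than code from the cited works: the zero-mean `deviation` matrix stands in for the device-to-device spread, and sign random projection is one standard LSH family for angular similarity.

```python
import numpy as np

rng = np.random.default_rng(0)

n_features, n_bits = 64, 16
# Hypothetical zero-mean spread around a nominal conductance; the deviation
# itself serves as the random projection matrix (sign random projection).
deviation = 0.2 * rng.standard_normal((n_features, n_bits))

def lsh_signature(x):
    """One vector-matrix multiply followed by thresholding at zero."""
    return (x @ deviation > 0).astype(np.uint8)

a = rng.standard_normal(n_features)
b = a + 0.05 * rng.standard_normal(n_features)   # near-duplicate of a
c = rng.standard_normal(n_features)              # unrelated vector

# Similar vectors agree in most signature bits; unrelated ones in roughly half.
print(np.sum(lsh_signature(a) == lsh_signature(b)))
print(np.sum(lsh_signature(a) == lsh_signature(c)))
```

Vectors that are close in angle collide in most signature bits, so candidate matches can be retrieved by comparing short binary codes instead of full-precision vectors.
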
“…Despite the promise, one of the main challenges for MCAs as neural network accelerators executing linear weighted summation in one step is the intrinsic device-to-device (D2D) variation [23]. Rather than trying to satisfy the demands of mainstream deep learning, researchers have gradually become aware of the great opportunities in exploiting the 'non-ideal' behaviors of memristors for unconventional computing [10, 24-29]. Wang et al. [25] used a nonvolatile MCA with intrinsic D2D variation for implementing the random projection in RC.…”
Section: Introduction (mentioning)
confidence: 99%
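
For context, the "linear weighted summation in one step" mentioned above is the analog vector-matrix multiplication a crossbar performs through Ohm's and Kirchhoff's laws. The following is a plain numerical model under an idealized linear-device assumption; the array size, conductance range, and 5% D2D spread are illustrative values, not figures from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(1)

rows, cols = 8, 4
G_target = rng.uniform(1e-6, 1e-5, size=(rows, cols))      # programmed conductances (S)
# Multiplicative device-to-device spread (illustrative 5% sigma).
G_actual = G_target * (1 + 0.05 * rng.standard_normal((rows, cols)))

v = rng.uniform(0.0, 0.2, size=rows)                        # row input voltages (V)

# Kirchhoff's current law: each column current is the dot product of the row
# voltages with that column's conductances -- the whole summation in one step.
i_columns = v @ G_actual

print(i_columns)                                            # output currents (A)
print(np.abs(i_columns - v @ G_target) / (v @ G_target))    # relative error from D2D spread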
“…The search energy, search speed, and cell area are the three most critical factors for measuring the performance of TCAM cells. [127-131] With the advantages of lower power consumption and higher integration density than other types of memristors, SRM-based TCAM cells demonstrated substantial superiority, mainly in terms of search energy and cell area. [85,101,104,132,133] Unlike conventional transistor-based TCAM structures, the SRM-based TCAM features a simple 2R structure resulting from its self-rectifying nature, which yields higher area efficiency.…”
Section: Similarity Search (mentioning)
confidence: 99%
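
For readers unfamiliar with TCAM, the cell-level figures of merit above refer to circuits that all realize the same logical operation: each stored symbol is 0, 1, or X (don't care), and a word matches when every non-X cell equals the corresponding search bit. The sketch below reproduces only that logical behavior and says nothing about the 2R circuit-level implementation.

```python
# Logical model of ternary content-addressable memory (TCAM) matching.
# Stored symbols: '0', '1', or 'X' (don't care). A word matches the search key
# when every cell that is not 'X' equals the corresponding key bit.

def tcam_match(stored_word: str, key: str) -> bool:
    return all(s == 'X' or s == k for s, k in zip(stored_word, key))

table = ["10X1", "0XX0", "1101"]   # illustrative stored entries
key = "1011"

# A real TCAM compares every entry against the key in parallel;
# here the parallelism is just a list comprehension.
print([tcam_match(word, key) for word in table])   # [True, False, False]
```
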
“…[104] Recently, Yang et al. proposed a nonlinear device (V/VOx/HfOx/Pt) based on the interfacial spontaneous oxidation process of the V electrode and designed an ultralow-power 2R TCAM unit (Figure 8c). [127] Based on this TCAM, an in-memory search system with high energy efficiency was experimentally demonstrated. All these studies prove that SRM is a promising candidate for exploring the search energy and cell area limits of TCAM cells.…”
Section: Similarity Search (mentioning)
confidence: 99%
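
Similarity (approximate) search on such an array is commonly framed as returning the stored word with the fewest mismatching cells, that is, the nearest neighbor in Hamming distance. The sketch below shows that framing in plain software; it is a functional illustration only and does not model the energy, timing, or peripheral circuitry of the memristor-based system in the cited work.

```python
import numpy as np

def hamming_nearest(database: np.ndarray, query: np.ndarray) -> int:
    """Return the row index with the fewest mismatching bits (most similar entry)."""
    mismatches = np.count_nonzero(database != query, axis=1)
    return int(np.argmin(mismatches))

# Binary signatures such as the LSH codes sketched earlier could be stored here.
database = np.array([[1, 0, 1, 1, 0, 0, 1, 0],
                     [0, 1, 1, 0, 1, 0, 0, 1],
                     [1, 0, 1, 0, 0, 0, 1, 0]], dtype=np.uint8)
query = np.array([1, 0, 1, 1, 0, 1, 1, 0], dtype=np.uint8)

print(hamming_nearest(database, query))  # 0: row 0 differs from the query in only one bit
```
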