2024
DOI: 10.1002/aelm.202300850

Random Projection‐Based Locality‐Sensitive Hashing in a Memristor Crossbar Array with Stochasticity for Sparse Self‐Attention‐Based Transformer

Xinxin Wang,
Ilia Valov,
Huanglong Li

Abstract: The self‐attention mechanism is central to state‐of‐the‐art transformer models. Because standard full self‐attention has quadratic complexity, O(L²), in the input length L, it requires prohibitively large memory for very long sequences; sparse self‐attention enabled by random projection (RP)‐based locality‐sensitive hashing (LSH) has therefore recently been proposed to reduce the complexity to O(L log L). However, in current digital computing hardware with a von Neumann architecture, RP, which is …
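
To make the abstract's idea concrete, below is a minimal sketch of RP‐based LSH sparse attention in the style of Reformer‐like models: random Gaussian projections hash queries/keys into buckets, and attention is computed only within each bucket instead of over the full L × L score matrix. This is not the authors' implementation; the function names (rp_lsh_buckets, lsh_attention), the bucket count, and the shared query/key assumption are illustrative, and the memristor‐crossbar stochasticity the paper exploits is not modeled here.

import numpy as np

def rp_lsh_buckets(x, n_buckets, rng):
    """Assign each row of x (shape (L, d)) to one of n_buckets via random projection.

    A Gaussian random matrix projects each d-dimensional vector onto
    n_buckets/2 random directions; the argmax over the concatenated
    [proj, -proj] vector is the hash bucket (angular LSH), so nearby
    vectors land in the same bucket with high probability.
    """
    L, d = x.shape
    assert n_buckets % 2 == 0
    r = rng.standard_normal((d, n_buckets // 2))  # random projection matrix
    proj = x @ r                                  # (L, n_buckets/2)
    return np.argmax(np.concatenate([proj, -proj], axis=-1), axis=-1)

def lsh_attention(q, k, v, n_buckets=8, seed=0):
    """Sparse attention: tokens attend only within their LSH bucket,
    replacing the O(L^2) score matrix with small per-bucket blocks."""
    rng = np.random.default_rng(seed)
    buckets = rp_lsh_buckets(q, n_buckets, rng)
    out = np.zeros_like(v)
    for b in np.unique(buckets):
        idx = np.where(buckets == b)[0]
        # Scaled dot-product attention restricted to one bucket.
        scores = q[idx] @ k[idx].T / np.sqrt(q.shape[-1])
        w = np.exp(scores - scores.max(axis=-1, keepdims=True))
        w /= w.sum(axis=-1, keepdims=True)
        out[idx] = w @ v[idx]
    return out

# Toy usage: 128 tokens, 16-dim heads; queries and keys are shared,
# as in Reformer-style LSH attention.
L, d = 128, 16
rng = np.random.default_rng(1)
q = k = rng.standard_normal((L, d))
v = rng.standard_normal((L, d))
print(lsh_attention(q, k, v).shape)  # (128, 16)

In a memristor crossbar, the projection x @ r would be computed in one analog step by applying x as voltages and reading column currents, which is what motivates the in-memory realization of RP in the paper.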

Cited by 0 publications
References 33 publications