2021
DOI: 10.48550/arxiv.2103.06709
Preprint
Hypervector Design for Efficient Hyperdimensional Computing on Edge Devices

Toygun Basaklar,
Yigit Tuncel,
Shruti Yadav Narayana
et al.

Abstract: Hyperdimensional computing (HDC) has emerged as a new lightweight learning algorithm with smaller computation and energy requirements compared to conventional techniques. In HDC, data points are represented by high-dimensional vectors (hypervectors), which are mapped to high-dimensional space (hyperspace). Typically, a large hypervector dimension (≥ 1000) is required to achieve accuracies comparable to conventional alternatives. However, unnecessarily large hypervectors increase hardware and energy costs, which…

Cited by 3 publications (4 citation statements)
References 24 publications
“…In contrast, HVs with smaller dimensions may lose robustness but gain speed. Basaklar and co-workers [11] proposed reducing the dimensionality greatly, from 8192 to 64, with little loss in accuracy. However, manual design of the HVs was required, which is unsuitable for most other scenarios.…”
Section: Related Work and Motivation
confidence: 99%
“…To evaluate the existing HDC models, for a fair comparison we also implement the SOTA HDC classifier [15] with D = 8,000 using our acceleration designs. Moreover, we choose two other lightweight models for comparison: a compressed HDC model [2] that uses a small vector but has non-binary weights and per-feature ValueBoxes found using evolutionary search; and SFC-fix with FINN [35], which applies a 3-layer binary MLP on FPGA. For these models, we only report the results available for our considered datasets.…”
Section: Hardware Acceleration
confidence: 99%
“…In addition, the ultra-high dimension is another fundamental drawback of HDC models. Simply reducing the dimension without fundamentally changing the HDC design can dramatically decrease the inference accuracy. A recent study [2] reduces the dimension but considers non-binary HDC, which is unfriendly to hardware acceleration. Moreover, it uses different sets of value vectors for different features, and hence results in a large model size (Table 3).…”
Section: Related Work
confidence: 99%
“…and images [15][16][17] can be represented using HDC. Current research has demonstrated that HDC can achieve performance comparable to traditional machine learning techniques while supporting few-shot learning [18][19][20][21][22][23], high energy efficiency [24][25][26][27][28][29][30][31][32][33][34], and hardware acceleration [35][36][37]. HDC has wide applications, spanning supervised learning (e.g., classification [38][39][40] and regression [41]), unsupervised learning (e.g., clustering [42][43][44][45]), and even reasoning [46][47][48].…”
Section: Introduction
confidence: 99%