2021
DOI: 10.1186/s40708-021-00138-0

Near-channel classifier: symbiotic communication and classification in high-dimensional space

Abstract: Brain-inspired high-dimensional (HD) computing represents and manipulates data using very long, random vectors with dimensionality in the thousands. This representation provides great robustness for various classification tasks where classifiers operate at low signal-to-noise ratio (SNR) conditions. Similarly, hyperdimensional modulation (HDM) leverages the robustness of complex-valued HD representations to reliably transmit information over a wireless channel, achieving a similar SNR gain compared to state-of…

Cited by 12 publications (11 citation statements)
References 44 publications
“…The M encoders at the left compute the different query hypervectors, which will be bundled later on through the majority operation. Each encoder can encode data from e.g., different sensory modalities [26], [27], or streaming channels [18]. This is highly desirable since by doing a bundling of M queries, we virtually increase the throughput by a factor of M .…”
Section: Towards Wireless-enabled Scale-out HDC Architectures
confidence: 99%
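The bundling of M query hypervectors described in this statement can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not code from the cited work; the dimensionality D and encoder count M are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
D, M = 10_000, 5  # dimensionality and number of encoders (illustrative values)

# One random bipolar query hypervector per encoder (e.g., per sensory modality)
queries = rng.choice([-1, 1], size=(M, D))

# Bundle the M queries with the element-wise majority: the sign of the sum
# (M is odd here, so no component of the sum is ever zero)
bundle = np.sign(queries.sum(axis=0)).astype(int)

# Every query stays clearly correlated with the bundle, so all M can be
# matched against an associative memory in a single shot
sims = queries @ bundle / D  # cosine similarity, roughly 0.37 each for M = 5
```

Because each of the M queries remains similar to the single bundled vector, one associative-memory lookup serves all M queries at once, which is the M-fold throughput gain the statement refers to.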
“…By its very nature, HDC is extremely robust in the presence of failures, defects, variations, and noise, all of which are synonymous with ultra-low energy computation. It has been shown that HDC degrades very gracefully in the presence of various faults compared to baseline classifiers: HDC tolerates intermittent errors [29], permanent hard errors (in memory [30] and logic [31]), and spatio-temporal variations [32] in emerging technologies, as well as noise and interference in communication channels [15], [18]. These demonstrate robust operation of HDC under low signal-to-noise ratio and high variability conditions.…”
Section: Introduction
confidence: 96%
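The graceful degradation this statement describes is easy to reproduce in a toy setting. The sketch below is illustrative and not taken from any of the cited works: it flips 30% of a bipolar prototype's components, mimicking a noisy channel or hard memory errors, and nearest-neighbor decoding still recovers the class.

```python
import numpy as np

rng = np.random.default_rng(1)
D, n_classes = 10_000, 10  # illustrative values

# Item memory: one random bipolar prototype hypervector per class
prototypes = rng.choice([-1, 1], size=(n_classes, D))

# Corrupt a query for one class: flip 30% of its components
true_class = 3
noisy = prototypes[true_class].copy()
noisy[rng.random(D) < 0.30] *= -1

# Nearest-neighbor decoding by cosine similarity still recovers the class:
# the true prototype keeps similarity near 0.4 while impostors sit near 0
sims = prototypes @ noisy / D
pred = int(np.argmax(sims))
```

Because unrelated random hypervectors are nearly orthogonal (similarity concentrated around 0 with spread ~1/sqrt(D)), even heavy component-level corruption leaves a wide decision margin.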
“…Some of these results can be obtained analytically using the capacity theory. Additionally, [Summers-Stay et al, 2018], [Frady et al, 2018b], [Kim, 2018], [Hersche et al, 2021] elaborated on methods for recovering information from compositional HVs beyond the standard nearest neighbor search in the item memory, reaching a capacity of up to 1.2 bits/component [Hersche et al, 2021]. The works above were focused on the case where a single HV was used to store information, but as demonstrated in [Danihelka et al, 2016] the decoding from HVs can be improved if redundant storage is used.…”
Section: Information Capacity of HVs
confidence: 99%
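The standard nearest-neighbor decoding that these works improve upon can be sketched as follows. The codebook size, dimensionality, and key/value framing are illustrative assumptions, not details from the cited papers: key-value pairs are bound by element-wise multiplication, bundled into one compositional HV, and a value is recovered by unbinding with its key and searching the item memory.

```python
import numpy as np

rng = np.random.default_rng(2)
D, n_pairs = 10_000, 7  # illustrative; odd n_pairs avoids ties in the sum

# Item memory of value hypervectors and a matching set of key (role) vectors
values = rng.choice([-1, 1], size=(n_pairs, D))
keys = rng.choice([-1, 1], size=(n_pairs, D))

# Compositional HV: bind each key to its value (element-wise product),
# then bundle all bindings with the sign of the sum
composite = np.sign((keys * values).sum(axis=0)).astype(int)

# Decode the value stored under key 0: unbind with that key, then do a
# nearest-neighbor search in the item memory
probe = keys[0] * composite
sims = values @ probe / D
recovered = int(np.argmax(sims))  # index of the value bound to key 0
```

Each stored pair contributes crosstalk noise to every other pair's readout, which is why the recoverable information per component saturates; the methods cited above push past this plain nearest-neighbor baseline.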
“…HDC has been employed in a range of applications including cognitive computing [3], [4], robotics [5], distributed computing [6]- [8], communications [9]- [14], and in various aspects of machine learning. See [15] for a comprehensive review.…”
Section: Introduction
confidence: 99%