2017
DOI: 10.1109/tnnls.2016.2535338

Holographic Graph Neuron: A Bioinspired Architecture for Pattern Processing

Abstract: This article proposes the use of Vector Symbolic Architectures for implementing Hierarchical Graph Neuron, an architecture for memorizing patterns of generic sensor stimuli. The adoption of a Vector Symbolic representation ensures a one-layered design for the approach, while maintaining the previously reported properties and performance characteristics of Hierarchical Graph Neuron, and also improving the noise resistance of the architecture. The proposed architecture enables a linear (with respect to …
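To make the idea concrete, below is a minimal sketch of the kind of Vector Symbolic encoding the abstract describes: each (position, value) pair of a pattern is bound by elementwise multiplication, and the bound pairs are bundled into a single flat hypervector, which is what permits a one-layered design and graceful degradation under noise. The dimensionality, the bipolar {-1, +1} representation, and all names (`encode`, `similarity`) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

D = 10_000                      # hypervector dimensionality (illustrative)
rng = np.random.default_rng(0)

def random_hv():
    # Random bipolar hypervector in {-1, +1}^D.
    return rng.choice((-1, 1), size=D)

# Item memories: one fixed random hypervector per position and per symbol.
positions = {i: random_hv() for i in range(4)}
symbols = {s: random_hv() for s in "abc"}

def encode(pattern):
    # Bind each position to its symbol (elementwise multiply), bundle by
    # summation, then threshold back to bipolar: the whole pattern becomes
    # one flat hypervector (the "one-layered" design).
    acc = sum(positions[i] * symbols[s] for i, s in enumerate(pattern))
    return np.where(acc >= 0, 1, -1)

def similarity(a, b):
    return (a @ b) / D          # normalized dot product in [-1, 1]

# Memorize a few patterns, then retrieve the nearest one from a noisy query.
memory = {p: encode(p) for p in ("abca", "bbca", "acab")}
query = encode("abca")
noise = rng.choice(D, size=D // 5, replace=False)
query[noise] *= -1              # flip 20% of the components
best = max(memory, key=lambda p: similarity(memory[p], query))
print(best)                     # -> "abca", despite the injected noise
```

The noisy-query lookup at the end illustrates the noise-resistance claim: flipping 20% of the components only scales the expected similarity to the stored pattern by 0.6, which still dominates the similarity to the other entries.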

Cited by 63 publications (29 citation statements)
References 29 publications

“…Based on the use of these MAP operations, an encoder can be designed for various tasks, e.g., EMG [20], [22], [49], EEG [23], [50], ECoG [51], ExG [45], or in general pattern processing [52]. The encoder emits a hypervector representing the event of interest that is then fed into an associative memory (AM) for training and inference.…”
Section: B. Hyperdimensional Computing
Mentioning, confidence: 99%
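As a concrete illustration of the pipeline this statement describes, here is a minimal sketch of a MAP-style (Multiply-Add-Permute) n-gram encoder feeding an associative memory of class prototypes. The n-gram scheme, the toy training data, and all names are hypothetical assumptions; the cited biosignal papers (EMG, EEG, ECoG, ExG) use task-specific encoders.

```python
import numpy as np

D = 10_000
rng = np.random.default_rng(1)
# Item memory: one fixed random bipolar hypervector per input symbol.
item = {ch: rng.choice((-1, 1), size=D) for ch in "abcdef"}

def encode_ngram(seq, n=3):
    # MAP encoder sketch: Permute (np.roll) marks position within an
    # n-gram, Multiply binds the permuted items together, Add bundles
    # all n-grams of the sequence into one event hypervector.
    acc = np.zeros(D)
    for i in range(len(seq) - n + 1):
        gram = np.ones(D, dtype=int)
        for j, ch in enumerate(seq[i:i + n]):
            gram *= np.roll(item[ch], j)
        acc += gram
    return np.where(acc >= 0, 1, -1)

# Associative memory (AM): bundle the training encodings of each class
# into a single prototype hypervector.
train = {"classA": ["abcabc", "abcbca"], "classB": ["defdef", "efdfed"]}
am = {c: np.where(sum(encode_ngram(s) for s in seqs) >= 0, 1, -1)
      for c, seqs in train.items()}

def classify(seq):
    # Inference: encode the query and return the most similar prototype.
    q = encode_ngram(seq)
    return max(am, key=lambda c: int(q @ am[c]))

print(classify("abcab"))        # -> "classA" (nearest AM prototype)
```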
“…In [90], the formation of sparse memory vectors (with an additional operation of context-dependent thinning [134]) is considered, and in [91] the probability of correct recognition is estimated. The use of graded connections (the memory vector is formed by addition), including subsequent binarization, and the classification of vectors outside the stored base, are considered in [89,91].…”
Section: The Generalization of Krotov-Hopfield
Mentioning, confidence: 99%
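The following is a minimal sketch of the two constructions this statement mentions: forming a memory vector from sparse binary items by addition followed by threshold binarization (graded connections), and the additive form of context-dependent thinning, which keeps a superposition sparse by intersecting it with fixed random permutations of itself. The dimensionality, sparsity, threshold, and number of permutations are illustrative assumptions, not values from the cited papers.

```python
import numpy as np

D, M = 10_000, 100               # dimensionality; active bits per sparse item
rng = np.random.default_rng(2)

def sparse_hv():
    # Sparse binary hypervector: M ones out of D components.
    v = np.zeros(D, dtype=bool)
    v[rng.choice(D, size=M, replace=False)] = True
    return v

items = [sparse_hv() for _ in range(5)]

# Graded connections: form the memory vector by addition, then binarize
# with a threshold (here: keep components active in at least two items).
counts = np.sum(items, axis=0)
memory = counts >= 2
print(int(memory.sum()))         # far sparser than the raw superposition

# Context-dependent thinning (additive form): keep only those bits of the
# superposition z that also appear in at least one of K fixed random
# permutations of z, so the result stays sparse but depends on the whole
# set of superimposed items.
K = 8
perms = [rng.permutation(D) for _ in range(K)]
z = np.logical_or.reduce(items)
thinned = z & np.logical_or.reduce([z[p] for p in perms])
print(int(z.sum()), int(thinned.sum()))   # thinning lowers the active-bit count
```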
“…Note that such schemes for representing data by similarity-preserving binary vectors have been developed for objects of various data types, mainly (feature) vectors (see the survey in [131]), but also structured data types such as sequences [102,72,85,86] and graphs [127,128,148,136,62,134]. A significant part of this research has been developed in the framework of distributed representations [45,76,106,126,89], including binary sparse distributed representations [102,98,103,127,128,113,114,137,138,139,148,135,136,61,134,129,130,131,132,31,33] and dense distributed representations [75,76] (see [82,84,87,88,83] for examples of their applications).…”
Section: Generalization in NAMs
Mentioning, confidence: 99%