2012
DOI: 10.14704/nq.2012.10.2.507

Quantum Computation via Sparse Distributed Representation

Abstract: Quantum superposition states that any physical system simultaneously exists in all of its possible states, the number of which is exponential in the number of entities composing the system. The strength of presence of each possible state in the superposition, i.e., the probability with which it would be observed if measured, is represented by its probability amplitude coefficient. The assumption that these coefficients must be represented physically disjointly from each other, i.e., localistically, is nearly uni…
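The abstract's central claim, that amplitude-like coefficients can be carried by physically overlapping (non-localist) codes, can be illustrated with a toy sparse distributed representation. The sketch below is not the paper's actual model; the sizes N and K, the stored codes, and the hand-built blend are assumptions chosen only to show how the overlap between one active sparse code and each stored code behaves like that state's strength of presence in a superposition.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000   # number of binary units (illustrative choice)
K = 20     # active units per sparse distributed code (illustrative choice)

# Three hypothetical stored states, each coded by K randomly chosen units.
stored_codes = {name: rng.choice(N, size=K, replace=False)
                for name in ["A", "B", "C"]}

# One active code, built by hand so that it shares many units with A,
# some with B, and few with C.
active = np.concatenate([stored_codes["A"][:12],
                         stored_codes["B"][:6],
                         stored_codes["C"][:2]])

# The fraction of each stored code's units present in the active code
# plays the role of that state's "strength of presence" (its coefficient).
coefficients = {name: len(np.intersect1d(code, active)) / K
                for name, code in stored_codes.items()}
print(coefficients)   # roughly {'A': 0.6, 'B': 0.3, 'C': 0.1}
```

On this toy picture, changing the single active code changes every stored state's overlap at once, which is the sense in which all states are simultaneously present in one physical pattern.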

Cited by 4 publications (4 citation statements)
References 7 publications
“…Such models include holographic reduced representations (HRR; Plate, 1991, 2003) and hyperdimensional computing (HDC; Gayler, 1998; Kanerva, 2009), and will be referred to here by the umbrella term vector symbolic architectures (VSA; see Gayler, 2003; section 4.1.1). VSA models have been shown to be able to solve challenging tasks of cognitive reasoning (Rinkus, 2012; Kleyko & Osipov, 2014; Gayler, 2003). VSA principles have recently been incorporated into standard neural networks for advanced machine learning tasks (Eliasmith et al., 2012), inductive reasoning (Rasmussen & Eliasmith, 2011), and processing of temporal structure (Graves, Wayne, & Danihelka, 2014; Graves et al., 2016; Danihelka, Wayne, Uria, Kalchbrenner, & Graves, 2016).…”
Section: Introduction (mentioning)
confidence: 99%
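For readers unfamiliar with the HRR/VSA operations mentioned in the excerpt above, the following is a minimal sketch of Plate-style binding by circular convolution and approximate unbinding by circular correlation. The dimensionality D, the random-vector construction, and the variable names are illustrative choices, not taken from any of the cited papers.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 1024  # vector dimensionality (arbitrary choice for this sketch)

def rand_vec(d=D):
    # HRR-style random vector: i.i.d. Gaussian elements with variance 1/d.
    return rng.normal(0.0, 1.0 / np.sqrt(d), d)

def bind(a, b):
    # Binding by circular convolution, computed via the FFT.
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def unbind(trace, cue):
    # Approximate unbinding: circular correlation, i.e. convolution
    # with the involution of the cue.
    cue_inv = np.concatenate(([cue[0]], cue[:0:-1]))
    return bind(trace, cue_inv)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

role, filler, unrelated = rand_vec(), rand_vec(), rand_vec()
trace = bind(role, filler)                      # one fixed-width "reduced" vector
print(cosine(unbind(trace, role), filler))      # around 0.7: filler is recoverable
print(cosine(unbind(trace, role), unrelated))   # near 0: unrelated vector is not
```

A single bound pair decodes with cosine similarity of roughly 0.7 against the true filler versus near zero for an unrelated vector; superposing several bound pairs into one trace lowers, but does not destroy, this margin.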
“…Fairly recently, ideas have emerged, in both cognitive science and computer science, that certain distributed representations allow for information processing that is mathematically analogous to quantum computing (Aerts et al. 2009), or might even allow for classical computing with quantum power (Rinkus 2012). A recurring idea in these approaches is that so-called holographic reduced representations (Plate 1991) are good candidates for this.…”
Section: Classical Computing With Quantum Power (mentioning)
confidence: 99%
“…The idea that the brain represents information in a reduced, distributed fashion has been around for a while (Hinton 1990). Fairly recently, ideas have emerged, in both cognitive science and computer science, that certain distributed representations allow for information processing that is mathematically analogous to quantum computing (Aerts et al. 2009), or might even allow for classical computing with quantum power (Rinkus 2012).…”
Section: Classical Computing With Quantum Power (mentioning)
confidence: 99%
“…Some approaches and methods require specialized algorithms for storage and handling of vector data in a certain format (representation). Sparse binary vectors [28,29] (i.e., with a small share of nonzero components) are used, for example, in associative-projective neural networks [30,31] and in the efficient binary version [32-34] of distributed associative memory [10,11,35-38].…”
Section: Introduction (mentioning)
confidence: 99%
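As a rough illustration of the sparse binary vectors discussed in this last excerpt (a small share of nonzero components, retrieval by overlap counting), here is a toy sketch. The dimensionality, density, item names, and OR-based superposition are assumptions for illustration; practical schemes such as context-dependent thinning in associative-projective neural networks additionally keep the density of composite codes fixed, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(2)
N, M = 10_000, 100   # dimensionality and number of ones (illustrative: 1% density)

def sparse_vec():
    # A sparse binary vector: exactly M ones among N components.
    v = np.zeros(N, dtype=np.uint8)
    v[rng.choice(N, size=M, replace=False)] = 1
    return v

# A toy item memory of labelled sparse codes.
memory = {name: sparse_vec() for name in ["dog", "cat", "car"]}

# Superpose two items by component-wise OR.  (The density of the composite
# grows; schemes such as context-dependent thinning correct for this.)
query = memory["dog"] | memory["cat"]

# Retrieve by overlap: count the ones shared with each stored code.
scores = {name: int((v & query).sum()) for name, v in memory.items()}
print(scores)   # "dog" and "cat" score M = 100; "car" only a chance-level ~2
```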