2015
DOI: 10.48550/arxiv.1502.00831
Preprint

Open System Categorical Quantum Semantics in Natural Language Processing

Abstract: Originally inspired by categorical quantum mechanics (Abramsky and Coecke, LiCS'04), the categorical compositional distributional model of natural language meaning of Coecke, Sadrzadeh and Clark provides a conceptually motivated procedure to compute the meaning of a sentence, given its grammatical structure within a Lambek pregroup and a vectorial representation of the meaning of its parts. The predictions of this first model have outperformed those of other models in mainstream empirical language processing ta…

Cited by 18 publications (27 citation statements)
References 23 publications
“…This work also explores two closely related theories from the literature, those of double dilation and double mixing, developed to describe quantum-like aspects of ambiguity in natural language processing [4,17]. It was originally believed [13] that density hypercubes and double dilation coincided: we show that not to be the case.…”
Section: Introduction
confidence: 81%
“…that all maps in the form (4) can also be put in the form (3). Both double dilation and double mixing have found application to the modelling of ambiguity in natural language processing [4,17].…”
Section: Double Dilation, Double Mixing and Density Hypercubes
confidence: 99%
“…vector embeddings in machine learning) within one compositional framework that enables one to compute the meaning of a sentence from the meanings of its words. To achieve this it exploits the common compact closed categorical structure, be it of vectors and linear maps, or of positive operators and completely positive maps [2,31]. The DisCoCirc framework [5] improved on DisCoCat, (1) by enabling one to compose sentences into larger text, just as gates are composed in circuits; (2) by allowing meanings to evolve as that text evolves; (3) by specifying the sentence type as the tensored spaces of those entities that evolve.…”
Section: Compositional Language Meaning 2.1 DisCoCirc
confidence: 99%
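
As a rough illustration of the compact closed composition this excerpt describes (the vectors-and-linear-maps case), the sketch below contracts a transitive verb tensor with subject and object vectors to obtain a sentence vector. The dimensions, random vectors and NumPy encoding are illustrative assumptions, not taken from the cited papers; the positive-operator variant would replace vectors with density matrices and linear maps with completely positive maps.

    # Minimal DisCoCat-style sketch (illustrative assumptions throughout).
    # A transitive verb has pregroup type n^r · s · n^l, modelled here as a
    # rank-3 tensor in N ⊗ S ⊗ N; the sentence meaning is the contraction of
    # its two noun wires with the subject and object vectors.
    import numpy as np

    noun_dim, sent_dim = 4, 2                          # hypothetical dimensions
    rng = np.random.default_rng(0)
    subj = rng.random(noun_dim)                        # subject noun vector
    obj = rng.random(noun_dim)                         # object noun vector
    verb = rng.random((noun_dim, sent_dim, noun_dim))  # verb tensor

    # Pregroup reduction n · (n^r s n^l) · n -> s, as a tensor contraction:
    sentence = np.einsum('i,isj,j->s', subj, verb, obj)
    print(sentence)                                    # a vector in the sentence space S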
“…we recover the folding of density hypercubes [21,23], double dilation [37] and double mixing [2,9,29]. In all three cases the environment structure contains the caps from the original CPM construction (i.e.…”
Section: Definition 4 (Environment Structure)
confidence: 99%
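
For orientation, the "caps from the original CPM construction" referenced above are those of Selinger's construction over a dagger compact category. One common presentation is sketched below; the notation and factor ordering are an assumption of this note, not quoted from the cited paper, and conventions vary in the literature.

    % Sketch of Selinger's CPM construction (one common form).
    % A morphism A -> B in CPM(C) is a C-morphism A* ⊗ A -> B* ⊗ B of the shape
    \[
      \Phi \;=\; (\varepsilon_E \otimes 1_{B^*\otimes B}) \circ
                 (1_{E^*} \otimes \sigma_{B^*,\,E} \otimes 1_B) \circ
                 (f_* \otimes f),
      \qquad f : A \to E \otimes B,
    \]
    % where ε_E : E* ⊗ E -> I is the cap discarding the environment object E,
    % σ is the symmetry, and f_* is the conjugate of f.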
“…This generalisation expands upon the inherent symmetries of the original CPM construction to produce categories with more exotic properties, captured by an essential invariance under the action of a group of monoidal autofunctors. These categories have found uses in quantum natural language processing, where they capture multiple degrees of linguistic ambiguity [2,29,9,37], and in CQM, where they have been found to exhibit higher-order interference, hyper-decoherence and a rich phase group structure [21,23].…”
Section: Introduction
confidence: 99%