2021
DOI: 10.1017/s1351324921000206
Compositional matrix-space models of language: Definitions, properties, and learning methods

Abstract: We give an in-depth account of compositional matrix-space models (CMSMs), a type of generic models for natural language, wherein compositionality is realized via matrix multiplication. We argue for the structural plausibility of this model and show that it is able to cover and combine various common compositional natural language processing approaches. Then, we consider efficient task-specific learning methods for training CMSMs and evaluate their performance in compositionality prediction and sentiment analysis…
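The core idea summarized in the abstract can be illustrated with a short sketch: in a CMSM, each word is assigned a square matrix, and a phrase is represented by the ordered product of its word matrices, so word order affects the result because matrix multiplication is not commutative. The snippet below is only an illustration under assumed values (the dimension d, a toy vocabulary, and random matrices standing in for learned ones); it is not the training procedure from the paper.

import numpy as np

# Toy CMSM sketch: each word maps to a d x d matrix; a phrase is the
# ordered product of its word matrices (assumed random here, learned in practice).
d = 3
rng = np.random.default_rng(0)
vocab = {w: rng.normal(scale=0.5, size=(d, d)) for w in ["not", "very", "good"]}

def phrase_matrix(words):
    m = np.eye(d)                      # identity matrix represents the empty phrase
    for w in words:
        m = m @ vocab[w]               # composition via matrix multiplication
    return m

# Reordering the words generally changes the phrase representation,
# which is how CMSMs capture order sensitivity.
a = phrase_matrix(["not", "very", "good"])
b = phrase_matrix(["very", "not", "good"])
print(np.allclose(a, b))               # typically False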

Year Published: 2024
Cited by 1 publication (1 citation statement)
References 64 publications (120 reference statements)
“…However, some of these embeddings fail to capture the semantics (Jain, Kalo, Balke, Krestel 2021) and, in effect, a large amount of useless, false predictions might be generated. This major issue can possibly be remedied by novel embedding approaches (Abboud, Ceylan, Lukasiewicz, Salvatori 2020; Asaadi, Giesbrecht, Rudolph 2023).…”
Section: Common Cold (mentioning)
confidence: 99%