Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) 2016
DOI: 10.18653/v1/p16-1121
Learning Semantically and Additively Compositional Distributional Representations

Abstract: This paper connects a vector-based composition model to a formal semantics, the Dependency-based Compositional Semantics (DCS). We show theoretical evidence that the vector compositions in our model conform to the logic of DCS. Experimentally, we show that vector-based composition brings a strong ability to calculate similar phrases as similar vectors, achieving near state-of-the-art performance on a wide range of phrase similarity tasks and relation classification; meanwhile, DCS can guide building vectors for structured…

Cited by 13 publications (15 citation statements). References 44 publications.
“…In order to learn the parameters u_h, v_t, M_r of the score function, we follow Tian et al. (2016) in using a Noise Contrastive Estimation (NCE) (Gutmann and Hyvärinen, 2012) objective. For each path (or triple) h, r_1, ….”
Section: Base Model
confidence: 99%
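The citation statement above describes learning the parameters of a triple score function with an NCE-style objective. A minimal sketch of that setup, assuming a bilinear score u_h^T M_r v_t (the exact score form and hyperparameters here are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_words, n_relations = 8, 100, 5

# Hypothetical parameter tables: head vectors u_h, tail vectors v_t,
# and one matrix M_r per relation, giving a bilinear score u_h^T M_r v_t.
U = rng.normal(scale=0.1, size=(n_words, dim))
V = rng.normal(scale=0.1, size=(n_words, dim))
M = rng.normal(scale=0.1, size=(n_relations, dim, dim))

def score(h, r, t):
    """Bilinear score u_h^T M_r v_t for a triple (h, r, t)."""
    return U[h] @ M[r] @ V[t]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def nce_loss(h, r, t, neg_tails):
    """NCE-style objective: raise the score of the observed triple and
    lower the scores of sampled noise (negative) tails."""
    pos = np.log(sigmoid(score(h, r, t)))
    neg = sum(np.log(sigmoid(-score(h, r, tn))) for tn in neg_tails)
    return -(pos + neg)  # negative log-likelihood to be minimized

loss = nce_loss(3, 1, 7, neg_tails=[12, 45, 88])
```

In practice the negative tails are drawn from a noise distribution over the vocabulary and the loss is minimized by gradient descent over U, V, and M; the sketch shows only the forward computation.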
“…Since distributed representations play an important role in various NLP tasks, they are applied to semantics (Herbelot and Vecchi, 2015; Qiu et al., 2015; Woodsend and Lapata, 2015), while incorporating external information into them (Tian et al., 2016; Nguyen et al., 2016). In addition, finding interpretable regularities in the representations is often pursued through non-negative and sparse coding (Murphy et al., 2012; Faruqui et al., 2015; Luo et al., 2015; Kober et al., 2016) and regularization (Sun et al., 2016).…”
Section: Related Work
confidence: 99%
“…In contrast, our proposal of the Near-far Context demonstrates that word order can be handled within an additive compositional framework, being parameter-free and having a proven bias bound. Recently, Tian et al. (2016) further extended additive composition to realizing a formal semantics framework.…”
Section: Related Work
confidence: 99%