2020
DOI: 10.1109/access.2020.3001765
Semantic Similarity Estimation Using Vector Symbolic Architectures

Abstract: For many natural language processing applications, estimating similarity and relatedness between words is a key task that serves as the basis for classification and generalization. Vector semantic models (VSMs) have become a fundamental language modeling tool. VSMs represent words as points in a high-dimensional space and follow the distributional hypothesis of meaning, which assumes that semantic similarity is related to context. In this paper, we propose a model whose representations are based o…
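The abstract's core premise — words as points in a high-dimensional space, with similarity read off from geometry — can be illustrated with a minimal sketch. This is not the paper's model; the vectors and dimensions below are hypothetical toy data standing in for context co-occurrence counts, and similarity is measured with standard cosine similarity.

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two word vectors (1.0 = identical direction)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 4-dimensional vectors (hypothetical data): each dimension could be a
# co-occurrence count with one of four context words, per the distributional
# hypothesis that words appearing in similar contexts have similar meanings.
cat = [4.0, 3.0, 0.0, 1.0]
dog = [5.0, 2.0, 1.0, 1.0]
car = [0.0, 1.0, 6.0, 4.0]

print(cosine_similarity(cat, dog))  # high: shared contexts
print(cosine_similarity(cat, car))  # low: disjoint contexts
```

Real VSMs use hundreds of dimensions learned from corpora, but the geometric reading of similarity is the same.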

Cited by 5 publications (1 citation statement)
References 52 publications
“…For placing the methods in the general context of word embeddings, please refer to [419]. [271] aimed to address the representation of similarity, rather than the relatedness represented in context HVs by BEAGLE and RI. To do so, for each word the authors represented its most relevant semantic features, taken from the ConceptNet knowledge base.…”
Section: Bound Encoding of the Aggregate Language Environment
Confidence: 99%
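The citing work describes representing each word by its most relevant semantic features, so that words sharing features (rather than merely sharing contexts) come out similar. A minimal sketch of that idea in a VSA style, assuming bipolar hypervectors and majority-vote bundling — the feature inventory below is hypothetical, standing in for features drawn from ConceptNet:

```python
import random

DIM = 10_000  # hyperdimensional vectors are typically thousands of dimensions
rng = random.Random(0)

def random_hv():
    """Random bipolar hypervector; independent random HVs are quasi-orthogonal."""
    return [rng.choice((-1, 1)) for _ in range(DIM)]

def bundle(hvs):
    """Superpose hypervectors by elementwise majority (sign of the sum).
    An odd number of inputs avoids ties; ties here break toward -1."""
    return [1 if sum(col) > 0 else -1 for col in zip(*hvs)]

def cosine(u, v):
    # Bipolar vectors all have norm sqrt(DIM), so cosine reduces to dot/DIM.
    return sum(a * b for a, b in zip(u, v)) / DIM

# Hypothetical feature inventory (a stand-in for ConceptNet features).
features = {f: random_hv()
            for f in ("animal", "pet", "barks", "meows",
                      "vehicle", "wheels", "engine")}

# Each word HV bundles the HVs of its semantic features.
dog = bundle([features["animal"], features["pet"], features["barks"]])
cat = bundle([features["animal"], features["pet"], features["meows"]])
car = bundle([features["vehicle"], features["wheels"], features["engine"]])

print(cosine(dog, cat))  # clearly positive: two of three features shared
print(cosine(dog, car))  # near zero: no shared features
```

Because similarity here flows from shared features rather than shared contexts, this construction targets similarity proper, in contrast to the context-driven relatedness captured by BEAGLE- and RI-style context HVs.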