2018
DOI: 10.1007/s10579-018-9417-z

COVER: a linguistic resource combining common sense and lexicographic information

Abstract: Lexical resources are fundamental to tackling many tasks that are central to present and prospective research in Text Mining, Information Retrieval, and connected Natural Language Processing applications. In this article we introduce COVER, a novel lexical resource, along with COVERAGE, the algorithm devised to build it. In order to describe concepts, COVER proposes a compact vectorial representation that combines the lexicographic precision characterizing BabelNet and the rich common-sense knowledge featuring ConceptNet…

Cited by 12 publications
(12 citation statements)
References 57 publications
“…common features are the elements at the intersection of the two feature sets, whilst distinctive features are owned by only one of the considered entities. This notion of similarity has been recently adjusted to model human similarity judgments for short texts (the Symmetrical Tversky Ratio Model [61]), and employed to compute semantic similarity of word pairs [62,63]. Another approach to similarity builds on the distinction between attributes and relations [64].…”
Section: Cognitively Plausible Semantic Distance Metricsmentioning
confidence: 99%
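The feature-based notion of similarity described in the snippet above can be sketched with Tversky's ratio model, where similarity grows with the common features and shrinks with the distinctive ones. The feature sets, weights, and function name below are illustrative assumptions, not the cited papers' actual code:

```python
# Sketch of Tversky's ratio-model similarity over feature sets.
# alpha/beta weight the features distinctive to each entity; the
# symmetric variant cited in the text tunes these weights further.

def tversky(a: set, b: set, alpha: float = 0.5, beta: float = 0.5) -> float:
    """Common features vs. features owned by only one entity."""
    common = len(a & b)   # elements at the intersection of the two sets
    only_a = len(a - b)   # distinctive features of a
    only_b = len(b - a)   # distinctive features of b
    denom = common + alpha * only_a + beta * only_b
    return common / denom if denom else 0.0

cat = {"mammal", "pet", "fur", "claws"}
dog = {"mammal", "pet", "fur", "barks"}
print(tversky(cat, dog))  # → 0.75 (3 common, 1 distinctive each)
```

With alpha = beta the measure is symmetric; setting them unequal recovers Tversky's original asymmetric judgments (e.g. "a is like b" vs. "b is like a").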
“…In particular, the nodes in the network represent concepts and entities (that is, persons, organisations and locations), and the edges intervening between any two nodes represent semantic relations (such as IsA, PartOf, etc.). Although further lexical resources exist containing different sorts of knowledge (e.g., WordNet [27], ConceptNet [23], COVER [20,26], or a hybrid approach proposed by [12,21]), we chose to adopt BabelNet because it ensures broad coverage of both concepts and entities, which are particularly relevant in the present domain. The semantic module aims at searching the terms present in the thesis titles, to individuate the underlying concepts, and then at checking whether they are either philosophical concepts (that is, linked to 'philosophy' in the BabelNet taxonomy) or philosophers.…”
Section: The Semantic Modulementioning
confidence: 99%
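The taxonomy check described in the snippet above can be sketched as a walk over IsA edges, testing whether a term's concept reaches 'philosophy'. The toy taxonomy and function name below are assumptions for illustration; the actual system queries BabelNet:

```python
# Minimal sketch: breadth-first walk upward along IsA edges from a
# concept, reporting whether the 'philosophy' node is reachable.

ISA = {  # toy child-concept -> parent-concepts edges (hypothetical)
    "epistemology": ["philosophy"],
    "ethics": ["philosophy"],
    "philosophy": ["discipline"],
    "botany": ["biology"],
    "biology": ["discipline"],
}

def is_philosophical(concept: str) -> bool:
    """Return True if 'philosophy' is an (indirect) IsA ancestor."""
    frontier, seen = [concept], set()
    while frontier:
        node = frontier.pop()
        if node == "philosophy":
            return True
        if node in seen:
            continue
        seen.add(node)
        frontier.extend(ISA.get(node, []))
    return False

print(is_philosophical("ethics"))  # → True
print(is_philosophical("botany"))  # → False
```

The same traversal shape applies when the edges come from a live BabelNet query instead of an in-memory dict; only the neighbor-lookup step changes.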
“…In machine learning, decision trees [4] and sparse linear models [5] are popular examples of techniques that produce interpretable models. Also in the AI field, lexical resources have been employed to assist in the construction of explanations of semantic similarity ratings between word pairs [6,7]. Many sorts of explanation can be provided, responding to diverse needs underlying the general aim of giving more transparency to algorithms and systems.…”
Section: Introductionmentioning
confidence: 99%