2017
DOI: 10.1007/978-3-319-70682-5_8
Resolving Range Violations in DBpedia

Cited by 12 publications (12 citation statements). References 15 publications.
“…These methods ensure KB consistency, but they can neither correct the knowledge itself nor deal with those wrong assertions that satisfy the constraints. Lertvittayakumjorn et al. [20] and Melo et al. [21] both correct assertions by replacing the objects or subjects with correct entities. The former found the substitute by either keyword matching or a simple graph structure metric, while the latter first retrieved candidate substitutes from the Wikipedia disambiguation page (which may not exist, especially for KBs that are not based on Wikipedia) and then ranked them by lexical similarity.…”

Section: Assertion Correction (mentioning)

confidence: 99%
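The correction strategy summarized above — retrieve candidate substitute entities, then rank them by lexical similarity to the erroneous value — can be sketched as follows. This is a minimal illustration, not the papers' actual implementation: the entity names, the candidate list, and the use of `difflib.SequenceMatcher` as the similarity measure are all assumptions for the example.

```python
from difflib import SequenceMatcher

def lexical_similarity(a: str, b: str) -> float:
    # Similarity ratio in [0, 1] based on longest matching subsequences.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def rank_candidates(wrong_value: str, candidates: list[str]) -> list[tuple[str, float]]:
    # Score each candidate substitute against the erroneous object
    # of the assertion and sort best-first.
    scored = [(c, lexical_similarity(wrong_value, c)) for c in candidates]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Hypothetical example: a misspelled object and candidates that might
# come from a disambiguation page.
candidates = ["Manchester_United_F.C.", "Manchester_City_F.C.", "Manchester"]
ranked = rank_candidates("Manchester_Unted", candidates)
print(ranked[0][0])  # best-ranked substitute
```

In practice the candidate set would come from an external source such as a disambiguation page, and the similarity measure could be any string metric (edit distance, Jaccard over character n-grams, etc.); `SequenceMatcher` is used here only because it is in the standard library.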
“…Lertvittayakumjorn et al. [20] and Melo et al. [21] found that most erroneous assertions are due to confusion or lexical similarity leading to entity misuse; for example, confusion between Manchester_United and Manchester_City, two football clubs based in Manchester, UK, can lead to facts about Manchester_United being incorrectly asserted about Manchester_City. Such errors are common not only in general KBs like DBpedia and Wikidata but also in domain KBs like the medical KB used in our evaluation.…”

Section: Introduction (mentioning)
confidence: 99%
“…Knowledge Graph assessment can differ along different quality dimensions (Batini et al. 2009; Zaveri et al. 2013). For example, Paulheim and Bizer (2013) tackle the identification of missing instance assertions, Fürber and Hepp (2010a) identify wrong and missing property value assertions, and Lertvittayakumjorn et al. (2017) address the identification of wrong property value assertions. The approach presented by Mendes et al. (2012) defines additional quality assessment methods.…”

Section: Knowledge Assessment (mentioning)
confidence: 99%
“…We chose to maintain DBpedia with our platform because DBpedia is widely used in the knowledge graph community. However, DBpedia contains many erroneous triples due to the extraction process and the incompleteness of infoboxes [8]. We added a knowledge graph refinement module [8] to UWKGM to clean and maintain DBpedia.…”

Section: Use Cases (mentioning)
confidence: 99%
“…However, DBpedia contains many erroneous triples due to the extraction process and the incompleteness of infoboxes [8]. We added a knowledge graph refinement module [8] to UWKGM to clean and maintain DBpedia. In the future, we could integrate new DBpedia releases into our modified DBpedia by comparing data under the version control of UWKGM.…”

Section: Use Cases (mentioning)
confidence: 99%