Proceedings of the International Conference on Web Intelligence 2017
DOI: 10.1145/3106426.3106463

Keeping linked open data caches up-to-date by predicting the life-time of RDF triples

Abstract: Many Linked Open Data applications require fresh copies of RDF data at their local repositories. Since RDF documents constantly change and those changes are not automatically propagated to the LOD applications, it is important to regularly visit the RDF documents to refresh the local copies and keep them up-to-date. For this purpose, crawling strategies determine which RDF documents should be preferentially fetched. Traditional crawling strategies rely only on how an RDF document has been modified in the past.…
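
The abstract describes prioritising RDF documents for re-fetching based on predicted triple life-times rather than document-level change history alone. The sketch below is only an illustration of that idea, not the authors' algorithm: it assumes a hypothetical predicted_lifetime(triple) estimator and ranks cached documents by the share of their triples whose predicted life-time has already elapsed since the last crawl.

```python
import time
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

Triple = Tuple[str, str, str]  # (subject, predicate, object)


@dataclass
class CachedDocument:
    url: str
    last_fetch: float                       # Unix timestamp of the last crawl
    triples: List[Triple] = field(default_factory=list)


def expired_fraction(doc: CachedDocument,
                     predicted_lifetime: Callable[[Triple], float],
                     now: float) -> float:
    """Share of triples whose predicted life-time (seconds) has elapsed since the last fetch."""
    if not doc.triples:
        return 0.0
    age = now - doc.last_fetch
    expired = sum(1 for t in doc.triples if predicted_lifetime(t) <= age)
    return expired / len(doc.triples)


def schedule_crawl(docs: List[CachedDocument],
                   predicted_lifetime: Callable[[Triple], float],
                   budget: int) -> List[CachedDocument]:
    """Pick the `budget` documents most likely to be stale, highest expired fraction first."""
    now = time.time()
    ranked = sorted(docs,
                    key=lambda d: expired_fraction(d, predicted_lifetime, now),
                    reverse=True)
    return ranked[:budget]


# Placeholder estimator (assumption, not from the paper): triples whose predicate
# ends in "date" are assumed to change daily, everything else monthly.
def toy_lifetime(triple: Triple) -> float:
    _, predicate, _ = triple
    return 86_400.0 if predicate.endswith("date") else 30 * 86_400.0
```

In the paper the life-time estimate would come from a prediction model over past triple-level changes; toy_lifetime stands in here purely to make the sketch executable.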

Cited by 7 publications (4 citation statements)
References 16 publications

Citation statements (ordered by relevance):
“…However, they do not use version chains and the additional step of conversion makes it hardly comparable to versioned, human-engineered ontologies. [24] introduce a new strategy for updating RDF links by predicting triple-level changes, and using these predictions to identify which RDF documents to update. However, their focus is not on the change prediction but rather on the update which follows.…”
Section: Related Work
confidence: 99%
“…We hide the maintenance process from incoming queries to improve response time. We use CAMP [12,33] to determine when a local view needs to be updated. CAMP identifies the highly dynamic sources in the Linked Data that are frequently updated and captures these changes to update the local views.…”
Section: Scheduling Updates of Linked Data
confidence: 99%
“…Analytics applications need constant updates to guarantee the quality of service in maintaining the local cache. To the best of our knowledge, there is limited work addressing the problem of keeping local views (or caches) up-to-date [6, 10–16]. Normally, a maintenance policy is needed to determine what to update and when.…”
Section: Introduction
confidence: 99%
“…Since its theoretical definition in 2001, the Semantic Web has evolved, built upon existing technology, and even boosted many advances in data processing over the Web. In particular, the advent of Linked Open Data (LOD) has spurred interest in representing general world knowledge as graphs from completely fresh perspectives, for example [34]. Technically, Linked Data refers to data published on the Web in such a way that it is machine-readable, explicitly defined, linked to other external data sets, and can be linked to/from external data sets [8].…”
Section: Introduction
confidence: 99%