2010
DOI: 10.14778/1920841.1920859

Scalable data exchange with functional dependencies

Abstract: The recent literature has provided a solid theoretical foundation for the use of schema mappings in data-exchange applications. Following this formalization, new algorithms have been developed to generate optimal solutions for mapping scenarios in a highly scalable way, by relying on SQL. However, these algorithms suffer from a serious drawback: they are not able to handle key constraints and functional dependencies on the target, i.e., equality generating dependencies (egds). While egds play a crucial role in…
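For readers unfamiliar with the terminology, the following is a minimal, hypothetical sketch (table and column names are invented for illustration, not taken from the paper) of how a source-to-target tgd is typically executed as SQL, and of the kind of target key constraint (egd) that such SQL alone does not enforce:

-- Hypothetical source table Src(emp, dept) and target table Emp(emp, dept, mgr),
-- all columns assumed to be text.
-- A source-to-target tgd such as  Src(e, d) -> exists m: Emp(e, d, m)
-- can be run as ordinary SQL by inventing a labeled null (Skolem value) for m:
INSERT INTO Emp (emp, dept, mgr)
SELECT s.emp, s.dept, 'N#' || s.emp || '#' || s.dept
FROM Src AS s;

-- A target egd such as  Emp(e, d, m), Emp(e, d', m') -> d = d', m = m'
-- (i.e., emp is a key of Emp) is not enforced by the insert above:
-- it requires equating values across tuples, which is the case the paper addresses.
ALTER TABLE Emp ADD CONSTRAINT emp_key PRIMARY KEY (emp);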

Cited by 34 publications (32 citation statements). References 22 publications.

“…If there were a core-schema mapping for M, i.e., a set of FO-rules to compute J0, then it would be possible to compute the transitive closure of A using first-order logic, which is obviously a contradiction. With respect to target egds, a similar result was recently proven in [23], where the authors show that it is not possible in general to rewrite a set of s-t tgds and target egds as an equivalent set of FO-rules. Nevertheless, the techniques developed in this paper represent an important building block towards the goal of developing scalable core-computation techniques for large classes of scenarios that include target constraints, as follows.…”
Section: Target Constraints (supporting)
confidence: 52%
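The transitive-closure argument above is the classical first-order inexpressibility example: reachability over an edge relation requires recursion, which no fixed set of non-recursive (first-order) rules can express. A minimal sketch, assuming a hypothetical edge table A(src, dst), of the recursive query that would be needed:

-- Transitive closure of A(src, dst) needs a recursive query; no fixed set of
-- plain, non-recursive SELECT statements works for graphs of arbitrary diameter.
WITH RECURSIVE tc(src, dst) AS (
    SELECT src, dst FROM A
  UNION
    SELECT tc.src, a.dst
    FROM tc JOIN A AS a ON tc.dst = a.src
)
SELECT * FROM tc;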
“…Handling key constraints is a delicate task, due to the particular form of processing that they require on the target instances, essentially equating values. However, in [23] it was shown that, for a very large fraction of cases, it is possible to rewrite a mapping scenario containing s-t tgds and target egds as a set of FO-rules, and then use these rules to compute core universal solutions for the original scenario. The results in [23] heavily rely on the algorithms developed in this paper in order to perform the rewriting, thus confirming the relevance of our contributions.…”
Section: Target Constraints (mentioning)
confidence: 99%
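As a rough illustration of the "equating values" behaviour mentioned above (this is not the rewriting of [23] itself), consider a key constraint on a hypothetical target table Emp(emp, dept): tuples agreeing on emp must agree on dept, so a labeled null (marked here with an invented 'N#' prefix) can be replaced by a constant coming from another tuple with the same key:

-- Hypothetical target table Emp(emp, dept), key: emp.
-- The egd  Emp(e, d), Emp(e, d') -> d = d'  equates the dept values of tuples
-- that share the same emp; here a labeled null is overwritten by a constant
-- (PostgreSQL-style UPDATE ... FROM).
UPDATE Emp
SET dept = other.dept
FROM Emp AS other
WHERE Emp.emp = other.emp
  AND Emp.dept LIKE 'N#%'
  AND other.dept NOT LIKE 'N#%';

Duplicate tuples left behind by such an update would still have to be removed; folding all of this directly into the mappings, rather than repairing the target afterwards, is what the FO-rule rewriting discussed above aims at.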
“…Data exchange based on schema mapping was introduced by the Clio project [2], [3], which creates transformation scripts by using schema mappings. The data exchange problem has also been studied by Fagin et al., who propose the theoretical foundations behind data exchange [4]. In the post-processing approach, the target solution generated by a system is pruned and processed, using the concept of universal solution, to compute the core solution [6], [7]. As argued in [8], this technique may result in high redundancy that consequently impairs the efficiency of a data exchange system. On the other hand, in pre-processing approaches such as ++Spicy [8], the mappings are refined so that the schema-mapping expressions directly generate the core solution. In spite of considerable improvements in data exchange, there exist scenarios that cannot be handled properly.…”
Section: Introduction (mentioning)
confidence: 99%
“…The data exchange problem has also been studied by Fagin et al., who propose the theoretical foundations behind data exchange [4]. In the post-processing approach, the target solution generated by a system is pruned and processed, using the concept of universal solution, to compute the core solution [6], [7]. As argued in [8], this technique may result in high redundancy that consequently impairs the efficiency of a data exchange system. On the other hand, in pre-processing approaches such as ++Spicy [8], the mappings are refined so that the schema-mapping expressions directly generate the core solution. In spite of considerable improvements in data exchange, there exist scenarios that cannot be handled properly. More specifically, in many existing data exchange systems based on schema mapping (e.g., [2]), mappings are first created based on schema-level information, and then these mappings are used to translate source data to the target.…”
Section: Introduction (mentioning)
confidence: 99%