2021 IEEE 37th International Conference on Data Engineering (ICDE)
DOI: 10.1109/icde51399.2021.00103
Efficient Construction of Nonlinear Models over Normalized Data

Cited by 3 publications (3 citation statements) | References 23 publications
“…Recent advances [37], [38], [69] facilitate more automation and optimization opportunities by decomposing ML models to basic building blocks of linear algebra (LA) operators, i.e., developing rewriting rules for LA operators and then pushing them down to joinable tables. Existing solutions for factorization over joins [37], [70], [71] mainly tackle inner joins. Our main contribution is to expand the dataset relationships to more integration scenarios, with left joins, full outer joins, and unions, cf.…”
Section: Algebraic Computation Over Silos
confidence: 99%
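To make the quoted idea of "pushing LA operators down to joinable tables" concrete, here is a minimal sketch of factorized computation over a normalized schema. The schema and all names (tables S and R, foreign key fk, feature matrices X_S and X_R) are illustrative assumptions, not code from the cited papers; the point is only that a linear scoring step over the join S ⋈ R can be computed from per-table partial products instead of the materialized join.

```python
import numpy as np

# Hypothetical toy schema: fact table S carries a foreign key into dimension
# table R. All names here are illustrative, not from the cited papers.
n_S, n_R, d_S, d_R = 1000, 50, 3, 4
rng = np.random.default_rng(0)
X_S = rng.normal(size=(n_S, d_S))      # features stored in S
X_R = rng.normal(size=(n_R, d_R))      # features stored in R
fk = rng.integers(0, n_R, size=n_S)    # foreign key S -> R

w_S = rng.normal(size=d_S)
w_R = rng.normal(size=d_R)

# Materialized approach: build the joined feature matrix, then multiply.
T = np.hstack([X_S, X_R[fk]])          # n_S x (d_S + d_R), R rows copied redundantly
scores_materialized = T @ np.concatenate([w_S, w_R])

# Factorized approach: push the partial products down to the base tables,
# then gather R's contribution through the foreign key instead of joining.
partial_R = X_R @ w_R                  # one value per R row
scores_factorized = X_S @ w_S + partial_R[fk]

assert np.allclose(scores_materialized, scores_factorized)
```

The factorized variant does the expensive multiplication once per R row rather than once per joined row, which is where the savings come from when many S rows share the same R row.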
“…Another interesting direction is to support more ML models, e.g., transformers [72]. In existing works, non-linearity is studied with regard to feature interactions [70] and ML models, e.g., Gaussian Mixture Models, Neural Networks [71].…”
Section: A Computation Challenge: DI Metadata For Factorization
confidence: 99%
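As a hedged illustration of how non-linear models can still benefit from the same trick, the sketch below (same assumed toy schema as above, not the paper's algorithm) runs a one-hidden-layer neural network over the join: the hidden pre-activations are linear in the joined features, so they factorize exactly as in the linear case, and the tanh non-linearity is applied per row afterwards.

```python
import numpy as np

# Same assumed toy schema as the earlier sketch: S joins to R via fk.
n_S, n_R, d_S, d_R, h = 1000, 50, 3, 4, 8
rng = np.random.default_rng(1)
X_S = rng.normal(size=(n_S, d_S))
X_R = rng.normal(size=(n_R, d_R))
fk = rng.integers(0, n_R, size=n_S)

W_S = rng.normal(size=(d_S, h))        # hidden-layer weights for S's features
W_R = rng.normal(size=(d_R, h))        # hidden-layer weights for R's features
v = rng.normal(size=h)                 # output-layer weights

# Materialized: join first, then run the forward pass.
T = np.hstack([X_S, X_R[fk]])
out_materialized = np.tanh(T @ np.vstack([W_S, W_R])) @ v

# Factorized: compute each table's contribution to the pre-activations once,
# gather R's contribution via the foreign key, then apply tanh per row.
pre_R = X_R @ W_R                      # n_R x h, computed once per R row
out_factorized = np.tanh(X_S @ W_S + pre_R[fk]) @ v

assert np.allclose(out_materialized, out_factorized)
```

The non-linearity does not interfere with factorization here because it acts row-wise after the linear map; model classes whose non-linearities mix rows or features across tables (e.g., the Gaussian Mixture Models and more general networks mentioned above) require the more involved rewriting studied in the cited work.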