2019
DOI: 10.1007/s00365-019-09461-6

Learning General Sparse Additive Models from Point Queries in High Dimensions

Abstract: We consider the problem of learning a d-variate function f defined on the cube [−1, 1]^d ⊂ ℝ^d, where the algorithm is assumed to have black-box access to samples of f within this domain. Denote S_r ⊂ ([d] choose r), r = 1, …, r_0, to be sets consisting of unknown r-wise interactions amongst the coordinate variables. We then focus on the setting where f has an additive structure, i.e., it can be represented as

f(x_1, …, x_d) = ∑_{r=1}^{r_0} ∑_{j ∈ S_r} φ_j(x_j),

where each φ_j, j ∈ S_r, is at most r-variate for 1 ≤ r ≤ r_0. We derive randomized algorithms that qu…
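The following is a minimal sketch, not the paper's algorithm, illustrating the function class described in the abstract: a toy f on [−1, 1]^d that is a sum of low-order interaction terms φ_j, each depending on at most r of the d coordinates, accessed only through point queries. The dimensions, interaction sets, and component functions below are hypothetical choices for illustration.

```python
# Sketch of a sparse additive model with r-wise interactions and black-box
# point-query access. All concrete terms here are illustrative assumptions.
import numpy as np

d = 20    # ambient dimension
r0 = 2    # maximum interaction order

# Hypothetical active interaction sets: S_1 (univariate), S_2 (pairwise).
S1 = [(3,), (11,)]
S2 = [(2, 7)]

# Toy component functions phi_j, each at most r-variate.
phi = {
    (3,): lambda x: np.sin(np.pi * x[3]),
    (11,): lambda x: x[11] ** 2,
    (2, 7): lambda x: x[2] * np.exp(x[7]),
}

def f(x):
    """Black-box oracle: f(x) = sum over all interaction sets j of phi_j(x_j)."""
    return sum(phi[j](x) for j in S1 + S2)

# Point-query access: the learner evaluates f at chosen points in [-1, 1]^d,
# e.g. at randomly drawn query locations.
rng = np.random.default_rng(0)
queries = rng.uniform(-1.0, 1.0, size=(5, d))
samples = np.array([f(x) for x in queries])
print(samples)
```

The learner never sees the sets S_1, S_2 or the functions φ_j directly; it must recover the additive structure from such query-sample pairs.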

Cited by 2 publications (1 citation statement). References 43 publications.
“…Preasymptotic estimation of high-dimensional problems was also treated in [16,17,18]. Related high-dimensional problems were studied in [22,30] based on ANOVA decomposition. The paper [11] investigated dimension-dependent estimates of the approximation error for linear algorithms of sampling recovery on Smolyak grids of functions from the space with Hölder-Zygmund mixed smoothness.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)