2022
DOI: 10.48550/arxiv.2207.12826
Preprint
Variable Transformations in combination with Wavelets and ANOVA for high-dimensional approximation

Abstract: We use hyperbolic wavelet regression for the fast reconstruction of high-dimensional functions having only low dimensional variable interactions. Compactly supported periodic Chui-Wang wavelets are used for the tensorized hyperbolic wavelet basis on the torus. With a variable transformation we are able to transform the approximation rates and fast algorithms from the torus to other domains. We perform and analyze scattered-data approximation for smooth but arbitrary density functions by using a least squares m…
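The abstract combines two ingredients: a least-squares fit of scattered data in a periodic wavelet basis on the torus, and a variable transformation that maps samples from another domain (with a known density) back to the torus. The following is a hedged one-dimensional sketch of that pipeline, not the authors' implementation: periodized linear B-spline hat functions stand in for the Chui-Wang wavelet basis, and the standard normal CDF is an assumed example of a transformation that pushes normally distributed samples onto the torus.

```python
import math
import numpy as np

def design_matrix(x, n):
    """Evaluate n periodized linear B-splines (hats) centered at j/n,
    j = 0..n-1, at the sample points x in [0, 1)."""
    j = np.arange(n)
    # periodic distance on the torus between x and the centers j/n
    dist = np.abs((x[:, None] - j[None, :] / n + 0.5) % 1.0 - 0.5)
    return np.maximum(0.0, 1.0 - n * dist)

rng = np.random.default_rng(1)
y = rng.standard_normal(2000)          # scattered samples on R
g = np.exp(-y**2)                      # target function values on R

# Variable transformation: the normal CDF (an assumed choice) maps the
# samples to the torus; since it is the CDF of the sampling density, the
# transformed points are uniformly distributed on [0, 1).
phi = np.vectorize(lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0))))
x = phi(y)

n = 32                                 # number of basis translates
A = design_matrix(x, n)
coef, *_ = np.linalg.lstsq(A, g, rcond=None)

# Evaluate the periodic fit back on R at fresh points via the transformation.
y_test = np.linspace(-2.0, 2.0, 9)
approx = design_matrix(phi(y_test), n) @ coef
print(np.max(np.abs(approx - np.exp(-y_test**2))))  # small residual
```

The transformation does double duty here: it moves the approximation problem to the torus, where the periodic basis and its fast algorithms apply, and it flattens the sampling density so the least-squares system stays well conditioned.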

Cited by 1 publication (1 citation statement)
References 29 publications
“…The approximation of high-dimensional functions is a classical topic of mathematical analysis with rich real-world applications. Due to new numerical techniques arising from (stochastic) Fourier- and wavelet [7,43,36,37] as well as kernel [40] methods and deep learning approaches, the topic has recently attained increasing attention. For example, including information about the structure of the function class of interest into a deep neural network architecture can improve its approximation quality [3,17,14].…”
Citation classification: mentioning (confidence: 99%)