2020
DOI: 10.48550/arxiv.2008.12927
Preprint

Broadcasted Nonparametric Tensor Regression

Abstract: We propose a novel broadcasting idea to model the nonlinearity in tensor regression nonparametrically. Unlike existing nonparametric tensor regression models, the resulting model strikes a good balance between flexibility and interpretability. A penalized estimation procedure and a corresponding algorithm are proposed. Our theoretical investigation, which allows the dimensions of the tensor covariate to diverge, indicates that the proposed estimator enjoys a desirable convergence rate. Numerical experiments are conducted…
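As a rough illustration of the broadcasting idea described in the abstract (a sketch under assumptions, not the paper's estimator): a single unknown univariate function is applied entrywise ("broadcast") to the tensor covariate, and the regression function is an inner product with a coefficient tensor. The names `f`, `B`, `X`, the stand-in sine function, and the dimensions are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # stand-in for the unknown nonparametric function;
    # the same f is broadcast over every entry of the tensor
    return np.sin(x)

d1, d2, d3 = 4, 3, 2                      # tensor covariate dimensions (illustrative)
X = rng.standard_normal((d1, d2, d3))     # tensor covariate
B = rng.standard_normal((d1, d2, d3))     # coefficient tensor

# broadcasted nonparametric regression function: inner product <B, f(X)>,
# where f(X) applies f entrywise via NumPy broadcasting
y_hat = np.sum(B * f(X))
```

In the actual model, `f` itself is unknown and estimated (e.g., via basis expansion) and `B` is constrained to be low-rank; the sketch only shows the broadcast-then-contract structure.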

Cited by 3 publications (5 citation statements)
References 44 publications
“…Thus, some works use basis functions to approximate the true functions using weighting factors. In other words, the true function which needs to be estimated can be represented by basis functions, such as finite multidimensional Fourier series (Wahls et al., 2014), polynomial splines (Hao et al., 2021), and B-splines (Zhou et al., 2020b). Specifically,…”
Section: Tensor Additive Models (mentioning)
confidence: 99%
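The basis-expansion idea in the statement above can be sketched as follows, using a truncated-power polynomial-spline basis rather than the B-splines cited (a simpler but less well-conditioned choice); the data, knot locations, and degree are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.cos(2 * np.pi * x) + 0.1 * rng.standard_normal(200)

# truncated-power basis of degree 3 with three interior knots:
# {1, x, x^2, x^3} plus one shifted cubic hinge per knot
knots = np.array([0.25, 0.5, 0.75])
Phi = np.column_stack(
    [x**p for p in range(4)] +
    [np.clip(x - t, 0.0, None) ** 3 for t in knots]
)

# the unknown function is represented by its basis coefficients,
# estimated here by ordinary least squares (no penalty, for brevity)
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
y_fit = Phi @ coef
```

The cited works add penalties (e.g., group sparsity or elastic net) on such coefficients; the unpenalized fit above only shows the "function = basis × weights" representation.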
“…Hao et al. (2021) and Zhou et al. (2020b) both assumed the coefficient tensor to be low CP rank, while a low-rank tensor-train approximation is employed in Wahls et al. (2014). In addition, sparsity constraints are enforced over the latent factors to select the important correlated subregions for the response, such as the group sparsity constraint (Hao et al., 2021) and the elastic-net penalty (Zhou et al., 2020b). The alternating minimization method is used to update the latent factors.…”
Section: Tensor Additive Models (mentioning)
confidence: 99%
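A minimal sketch (not any cited author's algorithm) of the alternating-minimization-with-sparsity pattern described above, reduced to the rank-1 matrix case for brevity: each latent factor is updated by least squares with the other held fixed, followed by l1 soft-thresholding. The data `M`, penalty `lam`, and initialization are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# illustrative sparse rank-1 truth, observed with small noise
u_true = np.array([1.0, 0.0, 2.0, 0.0])
v_true = np.array([0.0, 3.0, 0.0])
M = np.outer(u_true, v_true) + 0.01 * rng.standard_normal((4, 3))

def soft_threshold(z, lam):
    # proximal operator of the l1 penalty (elementwise shrinkage)
    return np.sign(z) * np.clip(np.abs(z) - lam, 0.0, None)

v = np.ones(3)    # simple deterministic initialization
lam = 0.05        # l1 penalty level (illustrative)
for _ in range(50):
    # least-squares update of u with v fixed, then shrink toward sparsity
    u = soft_threshold(M @ v / (v @ v), lam / (v @ v))
    # symmetric update of v with u fixed
    v = soft_threshold(M.T @ u / (u @ u), lam / (u @ u))

residual = np.linalg.norm(M - np.outer(u, v))
```

The tensor versions cycle over all CP (or tensor-train) factors in the same fashion, with group-sparsity or elastic-net proximal steps in place of the plain soft-threshold.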
“…Indeed, most existing methods to solve the optimization over a low-CP-rank space are iterative algorithms (e.g., Guo et al., 2011; Tan et al., 2012; Zhou et al., 2013; Lock, 2018; Zhang et al., 2018; Hao et al., 2019; Zhou et al., 2020). Let θ_t represent the output value of θ at the t-th iteration of the algorithm.…”
Section: Characterization of Degeneracy (mentioning)
confidence: 99%
“…Early tensor predictor regression solutions focus on parametric and usually linear or generalized linear type models (Zhou et al., 2013). More recently, Hao et al. (2019) and Zhou et al. (2020) extended tensor predictor regression to nonparametric models through basis expansion. Relatedly, Rabusseau & Kadri (2016), Sun & Li (2017) studied tensor response regression under different low-dimensional structures, but all assumed linear association models and often assumed a normal distribution.…”
Section: Introduction (mentioning)
confidence: 99%