2019
DOI: 10.1142/s2010326320500173
Empirical likelihood for high-dimensional partially functional linear model

Abstract: This paper considers empirical likelihood inference for a high-dimensional partially functional linear model. An empirical log-likelihood ratio statistic is constructed for the regression coefficients of non-functional predictors and proved to be asymptotically normally distributed under some regularity conditions. Moreover, maximum empirical likelihood estimators of the regression coefficients of non-functional predictors are proposed and their asymptotic properties are obtained. Simulation studies are conducted…
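The empirical log-likelihood ratio mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it is a generic empirical likelihood computation for estimating equations E[g_i(θ)] = 0, where the profile ratio is obtained by solving for a Lagrange multiplier with Newton's method. All data and names below are illustrative assumptions.

```python
import numpy as np

def neg2_log_el(g):
    """-2 log empirical likelihood ratio for the hypothesis E[g_i] = 0.

    g : (n, d) array of estimating-function values g_i(theta).
    Solves sum_i g_i / (1 + lam' g_i) = 0 for the Lagrange multiplier
    lam by damped Newton iteration, then returns 2 * sum_i log(1 + lam' g_i).
    """
    n, d = g.shape
    lam = np.zeros(d)
    for _ in range(50):
        denom = 1.0 + g @ lam
        grad = (g / denom[:, None]).sum(axis=0)          # equation to zero out
        if np.linalg.norm(grad) < 1e-10:
            break
        # Jacobian of the equation above with respect to lam
        hess = -(g[:, :, None] * g[:, None, :]
                 / (denom ** 2)[:, None, None]).sum(axis=0)
        step = np.linalg.solve(hess, -grad)
        # step-halving keeps all implied weights 1/(n(1 + lam' g_i)) positive
        t = 1.0
        while np.any(1.0 + g @ (lam + t * step) <= 1e-8) and t > 1e-8:
            t *= 0.5
        lam = lam + t * step
    return 2.0 * np.sum(np.log1p(g @ lam))

# Evaluate the statistic at the true coefficient of a toy linear model
# y = x * theta + eps (hypothetical data, for illustration only):
rng = np.random.default_rng(1)
x = rng.normal(size=(300, 1))
y = x[:, 0] * 2.0 + rng.normal(size=300)
g = x * (y - x[:, 0] * 2.0)[:, None]     # score-type estimating function
stat = neg2_log_el(g)                    # approximately chi-square(1) under H0
```

Under the null, Wilks-type results give `stat` an approximate chi-square limit; the paper's high-dimensional setting instead normalizes the statistic to obtain asymptotic normality, which this low-dimensional sketch does not reproduce.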

Cited by 1 publication (1 citation statement)
References 18 publications
“… Estimation procedures: a range of methods has been studied for estimating the SFPLR model, including:
- a fully automatic estimation procedure using a data-driven method and cross-validation to select the bandwidth (smoothing parameter) of the nonparametric component [7];
- asymptotic normality of the linear part [8];
- the setting where the number of observations per subject is completely flexible, with convergence rates for the nonparametric part [9];
- a spline estimator of the nonparametric part, with its convergence rate [10];
- two estimation procedures, functional principal components regression (FPCR) and functional ridge regression (FRR) based on Tikhonov regularization [11][12][13];
- new estimators for the parametric component, namely the semiparametric least squares estimator (SLSE) [14];
- the nonparametric component approximated by a B-spline function [15] or a polynomial spline [16], with the slope function estimated in the functional principal component basis [15,16];
- k-nearest-neighbours (kNN) estimates, whose local adaptivity often outperforms kernel methods in practice, together with computational properties of this estimator [17][18][19];
- the Functional Semiparametric Additive Model via COmponent Selection and Smoothing Operator (FSAM-COSSO) in the sparse setting [20];
- sufficient dimension reduction methods such as sliced inverse regression (SIR) and sliced average variance estimation (SAVE) [21];
- estimation in reproducing kernel Hilbert spaces (RKHS) [22];
- frequentist and optimal model averaging [23];
- a latent group structure recovered with K-means clustering [24];
- a joint asymptotic framework based on a joint Bahadur representation [25];
- empirical likelihood estimation for non-functional high-dimensional covariates [26];
- sparse and penalized least-squares estimators [27]; software for these analyses is available [28].
Confidence regions: some papers include sections on computing confidence regions, which we do not repeat here.…”
Section: Other Extensions
Citation type: mentioning (confidence: 99%)
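One of the approaches listed in the citation statement, functional principal components regression (FPCR) [11]-[13], can be sketched for a partially functional linear model y = Zγ + ∫X(t)β(t)dt + ε: expand the functional covariate in its leading empirical eigenfunctions, then regress on the scalar covariates plus the FPC scores. The simulated data and every name below are illustrative assumptions, not code from any cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p_grid, m = 200, 100, 4            # samples, grid points, retained components
t = np.linspace(0, 1, p_grid)

# Simulated data (hypothetical): scalar covariates Z, Brownian-like X(t)
Z = rng.normal(size=(n, 2))
X = np.cumsum(rng.normal(size=(n, p_grid)), axis=1) / np.sqrt(p_grid)
gamma_true = np.array([1.0, -0.5])
beta_true = np.sin(2 * np.pi * t)                 # true slope function
y = Z @ gamma_true + (X * beta_true).mean(axis=1) + 0.1 * rng.normal(size=n)

# FPCA: eigendecomposition of the empirical covariance operator of X
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / n
evals, evecs = np.linalg.eigh(cov)                # ascending order
phi = evecs[:, ::-1][:, :m]                       # leading eigenfunctions (grid values)
scores = Xc @ phi / p_grid                        # FPC scores ~ integral of Xc(t) phi_j(t) dt

# OLS on [Z, scores]: gamma_hat estimates the parametric part,
# b_hat recovers the slope function through the truncated basis
design = np.hstack([Z, scores])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
gamma_hat, b_hat = coef[:2], coef[2:]
beta_hat = phi @ b_hat                            # estimated slope on the grid
```

Truncating at m components regularizes the ill-posed inverse problem; the FRR variant mentioned alongside FPCR instead keeps all components and shrinks them via a Tikhonov penalty.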