2020
DOI: 10.1109/tsp.2020.2970311

Low-Complexity Methods for Estimation After Parameter Selection

Abstract: Statistical inference of multiple parameters often involves a preliminary parameter selection stage. The selection stage has an impact on subsequent estimation, for example by introducing a selection bias. The post-selection maximum likelihood (PSML) estimator is shown to reduce the selection bias and the post-selection mean-squared-error (PSMSE) compared with conventional estimators, such as the maximum likelihood (ML) estimator. However, the computational complexity of the PSML is usually high due to the mul…
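To make the setting concrete, here is a minimal Python sketch (our own construction, not the paper's code) of estimation after parameter selection in the standard two-population Gaussian example: the population with the larger sample mean is selected, and its mean is then estimated. The sketch compares the plain ML estimate (the selected sample mean) against a PSML estimate obtained by brute-force maximization of the post-selection log-likelihood, log f(x; θ) − log P(selection; θ); that direct maximization is precisely the costly step the paper's low-complexity methods target. All variable names and toy parameter values below are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy setup (our choice): two Gaussian populations x_k ~ N(theta_k, sigma^2),
# n i.i.d. samples each. The selection rule picks the population with the
# larger sample mean; we then estimate the mean of the selected population.
theta = np.array([1.0, 0.8])    # true means (close values -> strong selection bias)
sigma, n, trials = 1.0, 10, 2000
s = sigma / np.sqrt(n)          # standard deviation of each sample mean

def psml(xbar, sel):
    """PSML sketch: maximize log f(x; theta) - log P(select sel; theta).
    For two Gaussian sample means, P(select 0; theta) =
    Phi((theta_0 - theta_1) / (s * sqrt(2))); symmetric for sel = 1."""
    sign = 1.0 if sel == 0 else -1.0
    def nll(th):
        loglik = -0.5 * np.sum((xbar - th) ** 2) / s**2             # Gaussian log-likelihood
        logsel = norm.logcdf(sign * (th[0] - th[1]) / (s * np.sqrt(2)))
        return -(loglik - logsel)                                   # negative post-selection log-lik.
    return minimize(nll, x0=xbar, method="Nelder-Mead").x

ml_err, psml_err = [], []
for _ in range(trials):
    xbar = theta + s * rng.standard_normal(2)   # sample means (sufficient statistics)
    sel = int(np.argmax(xbar))                  # selection stage
    ml_err.append(xbar[sel] - theta[sel])       # ML of the selected mean = its sample mean
    psml_err.append(psml(xbar, sel)[sel] - theta[sel])

for name, e in [("ML", np.array(ml_err)), ("PSML", np.array(psml_err))]:
    print(f"{name:5s} selection bias ~ {e.mean():+.3f}, PSMSE ~ {(e**2).mean():.3f}")
```

Running the script estimates the empirical selection bias and PSMSE of both estimators by Monte Carlo; note that each PSML evaluation here requires a numerical search, which illustrates why closed-form or low-complexity approximations are of interest.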

Cited by 7 publications (7 citation statements) · References 76 publications

Citation statements (ordered by relevance):
“…1) mmCCRB on the mmMSE of missing-mass unbiased estimators: For the sake of simplicity of derivation, we assume in this subsection that b_{N,0}(θ) = 0. According to Proposition 1, this condition is a sufficient condition for the Lehmann unbiasedness in (32) and (16). For this case and the classical model described by (3), it can be verified that S(θ) from (38) is a diagonal matrix with the diagonal elements…”
Section: Special Cases
confidence: 72%
“…In particular, the CCRB [26]-[29], which is associated with the CML estimator, is unsuited as a bound on the performance of Good-Turing estimators outside the asymptotic region, while it provides a lower bound on the MSE of any χ-unbiased estimator [28]-[30]. Our recent works on non-Bayesian estimation after selection [31]-[33] suggest that conditional schemes, in which the performance criterion depends on the observed data, require different CRB-type bounds.…”
Section: B. Related Work
confidence: 99%
“…However, in [38]-[41], the useful data are selected, not the model. In [42]-[45], we developed the CRB and estimation methods for models whose “parameters of interest” are selected based on the data, i.e., estimation after parameter selection, in which the model is perfectly known.…”
Section: B. Related Work
confidence: 99%