2021
DOI: 10.1007/s00180-021-01144-w
Consistent second-order discrete kernel smoothing using dispersed Conway–Maxwell–Poisson kernels

Cited by 7 publications (5 citation statements) | References 11 publications
“…To conclude this section, we highlight some of our previous results on the recent CoM-Poisson kernel estimator of Huang et al. (2021) and compare with the classical binomial one. In fact, we consider the refined version of the CoM-Poisson kernel satisfying (A1) and (A2) as follows: T = N = S_x for each x ∈ N and any h > 0, …”
Section: Theorem 24 (Asymptotic Normality) Let (A2) Be Satisfied If T… (mentioning)
confidence: 93%
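The estimator this statement refers to is the usual discrete associated-kernel smoother f̂_h(x) = n⁻¹ Σᵢ K_{x,h}(Xᵢ) with a Conway–Maxwell–Poisson (CoM-Poisson) kernel. The sketch below is illustrative only: it uses the standard (λ, ν) parametrization with a truncated normalizing series, and the rule mapping the target x and bandwidth h to (λ, ν) is a hypothetical stand-in, not the refined parametrization of Huang et al. (2021).

```python
import numpy as np
from math import lgamma

# Illustrative discrete associated-kernel smoother
#   f_hat_h(x) = (1/n) * sum_i K_{x,h}(X_i)
# with a CoM-Poisson (CMP) kernel in the standard (lambda, nu) parametrization.
# The mapping (x, h) -> (lambda, nu) below is a hypothetical stand-in,
# NOT the refined parametrization of Huang et al. (2021).

def cmp_pmf(y, lam, nu, truncation=200):
    """CMP pmf p(y) = lam**y / (y!)**nu / Z(lam, nu), with the normalizing
    constant Z evaluated by a truncated series (assumed adequate here)."""
    j = np.arange(truncation)
    log_terms = j * np.log(lam) - nu * np.array([lgamma(k + 1.0) for k in j])
    log_Z = np.logaddexp.reduce(log_terms)
    return float(np.exp(y * np.log(lam) - nu * lgamma(y + 1.0) - log_Z))

def cmp_kernel_estimator(x, sample, h):
    """Estimate the pmf at the target point x from a count sample."""
    lam = x + h                 # kernel located near x (lam > 0 since h > 0)
    nu = 1.0 / (1.0 + h)        # nu < 1: kernel dispersion grows with h
    return float(np.mean([cmp_pmf(int(xi), lam, nu) for xi in sample]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.poisson(3.0, size=200)
    print([round(cmp_kernel_estimator(x, data, h=0.3), 4) for x in range(8)])
```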
“…Several authors pointed out the use of a discrete associated kernel from Dirac and discrete triangular kernels (Kokonendji et al., 2007; Kokonendji and Zocchi, 2010) and also from extensions of Dirac kernels proposed by Aitchison and Aitken (1976) for categorical data and Wang and Van Ryzin (1981). Furthermore, we have count kernels such as the binomial (Kokonendji and Senga Kiessé, 2011) and, recently, the CoM-Poisson (Huang et al., 2021) kernels, which are both underdispersed (i.e., variance less than mean). See also Harfouche et al. (2018) and Senga Kiessé (2017) for other properties.…”
Section: Introduction (mentioning)
confidence: 99%
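For reference, a minimal sketch of the binomial count kernel in the form commonly attributed to Kokonendji and Senga Kiessé (2011): Binomial(x + 1, p) with p = (x + h)/(x + 1), whose variance (x + h)(1 − h)/(x + 1) stays below its mean x + h, matching the underdispersion noted in the statement above.

```python
import numpy as np
from scipy.stats import binom

# Underdispersed binomial count kernel (form commonly cited for
# Kokonendji and Senga Kiesse, 2011): for target x in N and h in (0, 1],
# K_{x,h} is Binomial(x + 1, p) with p = (x + h)/(x + 1), so the kernel
# mean is x + h and its variance (x + h)(1 - h)/(x + 1) is below the mean.

def binomial_kernel(y, x, h):
    """K_{x,h}(y) on the finite support {0, 1, ..., x + 1}."""
    return binom.pmf(y, n=x + 1, p=(x + h) / (x + 1))

def binomial_kernel_estimator(x, sample, h):
    """Discrete associated-kernel estimate f_hat(x) = mean_i K_{x,h}(X_i)."""
    return float(np.mean(binomial_kernel(np.asarray(sample), x, h)))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = rng.poisson(2.0, size=150)
    print([round(binomial_kernel_estimator(x, data, h=0.2), 4) for x in range(6)])
```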
“…x!, x ∈ N; Table 1 presents the computation times required to perform all ISE bandwidth selection techniques (7) for the gamma-count, double Poisson, binomial and CoM-Poisson smoothers, based on a single replication with sample sizes ranging from n = 20 to 500 for the target function C. For all sample sizes, the results show that the CoM-Poisson is the most time consuming, followed by the double Poisson smoother, mainly due to the normalizing constants in their expressions, (5) and (3), respectively. As the sample sizes increase, the binomial kernel outperforms in terms of CPU time due to its support S_x = {0, 1, …”
Section: Simulation Studies and an Application to Real Data (mentioning)
confidence: 99%
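The criterion labelled (7) in this citing paper is not reproduced here; the sketch below shows only the generic simulation-study form of ISE bandwidth selection, where the target pmf is known and h is chosen by grid search. The Poisson target is a stand-in, not the target function C, and the smoother is the binomial kernel from the previous sketch.

```python
import numpy as np
from scipy.stats import binom, poisson

# Generic simulation-study ISE bandwidth selection: with a known target pmf f,
# pick h minimising ISE(h) = sum_x (f_hat_h(x) - f(x))**2 over a grid.
# This is a hedged sketch, not the citing paper's criterion (7).

def binomial_kernel_estimator(x, sample, h):
    sample = np.asarray(sample)
    return float(np.mean(binom.pmf(sample, n=x + 1, p=(x + h) / (x + 1))))

def ise(sample, h, target_pmf, support):
    return sum((binomial_kernel_estimator(x, sample, h) - target_pmf(x)) ** 2
               for x in support)

def select_bandwidth(sample, target_pmf, support, grid=None):
    grid = np.linspace(0.01, 0.99, 50) if grid is None else grid
    scores = [ise(sample, h, target_pmf, support) for h in grid]
    return float(grid[int(np.argmin(scores))])

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    data = rng.poisson(4.0, size=100)            # stand-in target, not "C"
    h_star = select_bandwidth(data, lambda x: poisson.pmf(x, 4.0),
                              support=range(15))
    print(round(h_star, 3))
```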
“…They were developed using the proposed mean-dispersion method. Also, we considered the integrated squared error method (7) to select, as quickly and efficiently as possible, the bandwidths of the corresponding estimators. Through simulation experiments and real count data analysis, we demonstrated that these kernels perform better than the binomial kernel, falling between the CoM-Poisson kernel smoother (which performs best) and the binomial kernel (which performs worst).…”
Section: Summary and Final Remarks (mentioning)
confidence: 99%