2018
DOI: 10.1016/j.matcom.2017.11.004

Inference in a bimodal Birnbaum–Saunders model

Abstract: We address the issue of performing inference on the parameters that index a bimodal extension of the Birnbaum–Saunders (BS) distribution. We show that maximum likelihood point estimation can be problematic, since the standard nonlinear optimization algorithms may fail to converge. To deal with this problem, we penalize the log-likelihood function. The numerical evidence we present shows that maximum likelihood estimation based on such a penalized function is considerably more reliable. We also consider hypot…
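The penalization strategy described in the abstract can be illustrated with a short numerical sketch. The snippet below is a minimal, hypothetical example of penalized maximum likelihood: it uses SciPy's standard (unimodal) Birnbaum–Saunders density (`fatiguelife`) and a generic quadratic penalty with a tuning constant `lam` as stand-ins, since the paper's bimodal density and its specific penalty term are not reproduced on this page.

```python
# Minimal sketch: penalized maximum likelihood for a Birnbaum-Saunders-type model.
# The standard (unimodal) BS log-likelihood and a generic quadratic penalty are
# illustrative stand-ins; they are NOT the bimodal density or the penalty
# adopted in the paper.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import fatiguelife  # SciPy's Birnbaum-Saunders distribution


def penalized_negloglik(params, t, lam=0.01):
    """Negative BS log-likelihood plus a generic quadratic penalty (illustrative)."""
    alpha, beta = params
    if alpha <= 0 or beta <= 0:
        return np.inf  # keep the search inside the parameter space
    negloglik = -fatiguelife.logpdf(t, c=alpha, scale=beta).sum()
    penalty = lam * (np.log(alpha) ** 2 + np.log(beta) ** 2)
    return negloglik + penalty


# Simulated data and a derivative-free fit (Nelder-Mead avoids gradient issues
# that can arise when unpenalized optimization drifts toward the boundary).
rng = np.random.default_rng(1)
t = fatiguelife.rvs(c=0.5, scale=2.0, size=200, random_state=rng)
fit = minimize(penalized_negloglik, x0=np.array([1.0, np.median(t)]),
               args=(t,), method="Nelder-Mead")
print(fit.x)  # penalized point estimates of (alpha, beta)
```

The penalty keeps the objective well behaved when the unpenalized log-likelihood is flat or unbounded in some directions, which is the kind of convergence failure the abstract refers to; the exact form of the penalty used by the authors is not given here.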

Cited by 5 publications (2 citation statements)
References 50 publications
“…Three real data illustrations indicated that the proposed BBS model provides better adjustment compared to the BBSO model proposed by Olmos et al (2017). The proposed BBS distribution has some advantages over existing bimodal BS models: (i) unlike the BBSO model, the proposed BBS distribution does not suffer from convergence problems in the optimization process of the profile log-likelihood function as pointed out by Fonseca and Cribari (2018); (ii) the proposed BBS distribution does not present identifiability problems commonly encountered in mixture models, such as the MXBS distribution; and (iii) the proposed model does not present label switching problems (Celeux et al, 2006), that is, in a bimodal context with two groups, during the estimation an individual who was in group B can incorrectly stay in A and vice versa.…”
Section: Introduction (citation type: mentioning)
Confidence: 87%
“…In addition, the authors also studied the probabilistic properties and moments of the BBSO distribution, and showed that this model can fit well both unimodal and bimodal data in comparison with the BS, log-normal and skew-normal BS models. A thorough inference study on the parameters that index the BBSO distribution was addressed by Fonseca and Cribari (2018).…”
Section: Introduction (citation type: mentioning)
Confidence: 99%