2013
DOI: 10.1017/s1748499513000134

On the prediction of claim duration for income protection insurance policyholders

Abstract: This paper explores how we can apply various modern data mining techniques to better understand Australian Income Protection Insurance (IPI). We provide a fast and objective method of scoring claims into different portfolios using available rating factors. Results from fitting several prediction models are compared based on not only the conventional loss prediction error function, but also a modified loss function. We demonstrate that the prediction power of all the data mining methods under consideration is c…

Cited by 7 publications (3 citation statements)
References 15 publications
“…For the popular prediction tasks related to automobile insurance, the reduction in dataset dimensionality is also useful. Liu et al (2014) reduce their large claim frequency prediction to a multi-class prediction problem to aid the eventual implementation of Adaptive Boosting (AdaBoost) to automobile insurance data. The act of reducing the number of frequency classes contributes to AdaBoost presenting as superior to SVM, NN, DTs and GLM in terms of prediction ability and interpretability.…”
Section: Dimensionality Reduction
confidence: 99%
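To make the workflow in the quoted statement concrete, here is a minimal sketch of that kind of multi-class boosting setup: raw claim counts are bucketed into a few frequency classes and a boosted classifier is fit to rating factors. It uses scikit-learn's AdaBoostClassifier on synthetic data; the feature columns, Poisson-style frequency signal, and class cut-points are illustrative assumptions, not Liu et al.'s actual data or pipeline.

```python
# Sketch only (not the cited authors' code): reduce claim counts to a small
# number of frequency classes, then fit a boosted ensemble to rating factors.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=(n, 6))                   # stand-in rating factors
lam = np.exp(0.3 * X[:, 0] - 0.2 * X[:, 1])   # illustrative frequency signal
counts = rng.poisson(lam)                     # simulated claim counts

# "Dimensionality reduction" of the target: collapse counts to classes 0, 1, 2+
y = np.digitize(counts, bins=[1, 2])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = AdaBoostClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

Collapsing the count response into a handful of classes is what lets a standard multi-class boosting routine be applied directly, which is the simplification the citing paper highlights.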
“…LDA was originally developed in 1936 by Fisher, and it often produces models that obtain higher classification accuracies in comparison with more modern and complex classification methods [43]. It aims to maximize the ratio of the between-class variance to the within-class variance, and it provides the highest possible discrimination between different classes.…”
Section: Linear Discriminant Analysis (LDA)
confidence: 99%
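For reference, the criterion these statements describe is usually written as Fisher's ratio of between-class to within-class scatter. The notation below (scatter matrices S_B and S_W, class means μ_c, overall mean μ) is standard textbook notation, not taken from the cited papers:

```latex
J(\mathbf{w}) = \frac{\mathbf{w}^{\top} S_B\, \mathbf{w}}{\mathbf{w}^{\top} S_W\, \mathbf{w}},
\qquad
S_B = \sum_{c} n_c\, (\boldsymbol{\mu}_c - \boldsymbol{\mu})(\boldsymbol{\mu}_c - \boldsymbol{\mu})^{\top},
\qquad
S_W = \sum_{c} \sum_{i \in c} (\mathbf{x}_i - \boldsymbol{\mu}_c)(\mathbf{x}_i - \boldsymbol{\mu}_c)^{\top}.
```

The projection directions maximizing J push the class means as far apart as possible relative to the spread within each class, which is the "highest possible discrimination between different classes" the quotes refer to.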
“…The goal of linear discriminant analysis is to maximize the ratio of the between-class variance to the within-class variance, and it provides the highest possible discrimination between different classes. LDA is utilized in some recent ECG classification studies [10]. k-nearest neighbor is widely used in pattern recognition problems and is also employed in some recent ECG classification studies [11].…”
Section: Introduction
confidence: 99%