2012
DOI: 10.1287/opre.1110.1008
Sequential Importance Sampling and Resampling for Dynamic Portfolio Credit Risk

Abstract: We provide a sequential Monte Carlo method for estimating rare-event probabilities in dynamic, intensity-based point process models of portfolio credit risk. The method is based on a change of measure and involves a resampling mechanism. We propose resampling weights that lead, under technical conditions, to a logarithmically efficient simulation estimator of the probability of large portfolio losses. A numerical analysis illustrates the features of the method and contrasts it with other rare-event schemes rec…
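The ingredients named in the abstract — particles propagated sequentially, a change of measure tracked through likelihood weights, and a resampling step — can be illustrated with a minimal sketch. The code below is not the authors' algorithm; it is a generic sequential importance sampling and resampling (SISR) estimator for the probability of many defaults in a toy self-exciting intensity model, with every parameter (BASE_INTENSITY, CONTAGION, TILT, and so on) chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy parameters (illustrative only, not from the paper)
N_NAMES = 100          # obligors in the portfolio
N_STEPS = 50           # discrete time steps on [0, T]
DT = 1.0 / N_STEPS
BASE_INTENSITY = 0.05  # baseline default intensity per surviving name
CONTAGION = 0.01       # intensity jump per past default (self-exciting)
LOSS_LEVEL = 40        # rare event: at least this many defaults by T
N_PARTICLES = 2000
TILT = 3.0             # importance-sampling tilt on one-step default probs

def sisr_estimate():
    """Estimate P(# defaults by T >= LOSS_LEVEL) by SIS with resampling."""
    defaults = np.zeros(N_PARTICLES, dtype=int)  # running default counts
    log_w = np.zeros(N_PARTICLES)                # log likelihood ratios
    for _ in range(N_STEPS):
        intensity = BASE_INTENSITY + CONTAGION * defaults
        p = 1.0 - np.exp(-intensity * DT)        # nominal one-step default prob
        q = np.minimum(TILT * p, 0.95)           # tilted (proposal) default prob
        survivors = N_NAMES - defaults
        new_def = rng.binomial(survivors, q)     # sample under the tilted measure
        # Binomial(survivors, p) vs Binomial(survivors, q) likelihood ratio
        log_w += (new_def * (np.log(p) - np.log(q))
                  + (survivors - new_def) * (np.log1p(-p) - np.log1p(-q)))
        defaults = defaults + new_def
        # Multinomial resampling; every particle then carries the mean weight,
        # which keeps the overall estimator unbiased
        shifted = np.exp(log_w - log_w.max())
        idx = rng.choice(N_PARTICLES, size=N_PARTICLES, p=shifted / shifted.sum())
        mean_log_w = np.log(shifted.mean()) + log_w.max()
        defaults = defaults[idx]
        log_w = np.full(N_PARTICLES, mean_log_w)
    hit = defaults >= LOSS_LEVEL
    return np.mean(hit * np.exp(log_w))

print("estimated rare-event probability:", sisr_estimate())
```

The resampling step replaces low-weight particles with copies of high-weight ones and resets every weight to the running average, concentrating simulation effort on trajectories headed toward the rare event; the paper's contribution lies in choosing resampling weights that make such an estimator logarithmically efficient, which this generic sketch does not attempt.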

Cited by 13 publications (3 citation statements) | References 28 publications
“…Another line of research on modeling of portfolio credit risk focuses on dynamic intensity-based point process models. For such models, Deng et al. [2012] recently proposed a sequential importance sampling and resampling scheme for estimating rare-event probabilities. They showed that a logarithmically efficient estimator of the probability of large losses can be obtained by selecting appropriate resampling weights.…”
Section: Credit Portfolios
confidence: 99%
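For context, "logarithmically efficient" is the standard rare-event simulation criterion; stated below in generic notation, not taken from the paper's text:

```latex
% Generic statement of logarithmic (asymptotic) efficiency.
% Let Z_n be an unbiased estimator of the vanishing probability
% p_n = P(L_n >= x_n). The estimator Z_n is logarithmically efficient if
\[
  \lim_{n \to \infty} \frac{\log \mathbb{E}\!\left[Z_n^2\right]}{2 \log p_n} = 1,
\]
% i.e., the second moment decays at the best possible exponential rate,
% since Jensen's inequality forces E[Z_n^2] >= p_n^2 for any unbiased Z_n.
```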
“…There is a large literature on modelling PD-LGD correlation (Frye 2000; Pykhtin 2003; Miu and Ozdemir 2006; Kupiec 2008; Sen 2008; Witzany 2011; de Wit 2016; Eckert et al. 2016; and others listed in Frye and Jacobs 2012), but there is a much smaller literature on using IS to estimate large-deviation probabilities in such models. To the best of our knowledge, only Deng et al. (2012) and Jeon et al. (2017) have developed algorithms that allow for PD-LGD correlation (the former paper considers a dynamic intensity-based framework, the latter a static model with asymmetric and heavy-tailed risk factors). The present paper contributes to this nascent literature by developing algorithms that can be applied in a wide variety of PD-LGD correlation models that have been proposed in the literature and are popular in practice.…”
Section: Problem Formulation and Related Literature
confidence: 99%
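As a concrete, purely hypothetical illustration of IS in a static PD-LGD correlation model of the kind this statement contrasts with the dynamic framework, the sketch below mean-shifts the systematic factor in a one-factor Gaussian model in which both the conditional default probability and the LGD worsen as the factor falls. The factor loading RHO, the LGD link, and the shift MU are illustrative choices, not taken from any of the cited papers.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# Hypothetical one-factor Gaussian model with PD-LGD correlation
N_NAMES = 100
PD = 0.02              # unconditional default probability per name
RHO = 0.5              # factor loading (drives PD-LGD correlation)
MU = -2.5              # IS mean shift of the systematic factor
LOSS_LEVEL = 20.0      # rare event: portfolio loss >= this level
N_SIMS = 100_000
THRESH = norm.ppf(PD)  # default threshold in the Gaussian copula

def is_estimate():
    # Sample the systematic factor under the shifted proposal N(MU, 1)
    z = rng.normal(MU, 1.0, size=N_SIMS)
    # Likelihood ratio dN(0,1)/dN(MU,1) evaluated at z
    lr = np.exp(-MU * z + 0.5 * MU**2)
    # Conditional PD given the factor; low z means many defaults
    p_z = norm.cdf((THRESH - np.sqrt(RHO) * z) / np.sqrt(1.0 - RHO))
    defaults = rng.binomial(N_NAMES, p_z)  # conditionally independent names
    # LGD driven by the same factor, so losses worsen in downturns
    lgd = norm.cdf(-0.5 * z)               # hypothetical link, values in (0, 1)
    loss = defaults * lgd                  # unit exposure per name
    return np.mean((loss >= LOSS_LEVEL) * lr)

print("estimated tail probability:", is_estimate())
```

Shifting the common factor toward the downturn region makes large losses frequent under the proposal while the likelihood ratio lr restores unbiasedness; heavier-tailed factors or dynamic intensities would replace the Gaussian draw, but the weighting structure is the same.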
“…Since it is typically hard to compute the loss distribution, approximations through limit theory or simulation are called for. For example, Deng, Giesecke and Lai [8] propose an importance sampling technique to estimate rare-event probabilities in a credit risk portfolio. The method is based on a change of measure and resampling to approximate the zero-variance importance measure connected to the rare events.…”
Section: Introduction
confidence: 99%
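The "zero-variance importance measure" this statement refers to is the textbook benchmark obtained by conditioning the original measure on the rare event itself; stated generically:

```latex
% Zero-variance change of measure for estimating p = P(A) (textbook fact):
\[
  \frac{d\mathbb{Q}^{\ast}}{d\mathbb{P}} = \frac{\mathbf{1}_A}{p},
  \qquad \text{so that under } \mathbb{Q}^{\ast} \text{ the IS estimator }
  \mathbf{1}_A \, \frac{d\mathbb{P}}{d\mathbb{Q}^{\ast}} = p
  \text{ is constant.}
\]
% Q* cannot be sampled directly because it requires the unknown p;
% practical schemes, such as the change of measure plus resampling
% described above, aim to approximate it.
```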