1996
DOI: 10.1061/(asce)0733-950x(1996)122:2(93)
Empirical Simulation Technique Based Storm Surge Frequency Analyses

Cited by 47 publications (16 citation statements)
References 2 publications
“…If we assume that F(ζ_max) is time invariant, this can be related to return period [T_R(ζ_max)] by T_R(ζ_max) = 1 / {λ[1 − F(ζ_max)]}, where λ is the rate of hurricane landfall occurrence, specified as the Poisson frequency parameter, taken to be the average number of hurricanes per year making landfall within some prescribed distance along the coast. Surge frequency is then determined by fitting a parametric (e.g., Gumbel, Weibull, generalized extreme-value (GEV)) or nonparametric (e.g., empirical simulation technique [Borgman et al., 1992; Scheffner et al., 1996]) distribution to the historical surge data.…”
Section: Introduction
confidence: 99%
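The return-period relation quoted above can be sketched numerically. Below is a minimal illustration, assuming a Gumbel fit to the surge maxima; the location, scale, and landfall-rate values are hypothetical, chosen only to show how T_R follows from λ and F:

```python
import math

def return_period(zeta, lam, cdf):
    """T_R(zeta) = 1 / (lam * (1 - F(zeta))), where lam is the
    Poisson rate of landfalls per year and F is the surge CDF."""
    exceed = 1.0 - cdf(zeta)
    if exceed <= 0.0:
        return float("inf")
    return 1.0 / (lam * exceed)

def gumbel_cdf(x, mu, beta):
    """Gumbel CDF, one common parametric choice for surge maxima."""
    return math.exp(-math.exp(-(x - mu) / beta))

# Hypothetical parameters: location 2.0 m, scale 0.5 m,
# lam = 0.4 landfalls per year within the prescribed coastal reach.
tr = return_period(3.5, 0.4, lambda x: gumbel_cdf(x, 2.0, 0.5))
```

With these (made-up) parameters a 3.5 m surge comes out to roughly a 50-year event, showing how a modest λ stretches the return period relative to the annual-maxima view.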
“…Historical initial condition are usually not sufficient to catch all possible realizations that can occur: here we applied the methodology employed by James and Mason (2005) and Scheffner et al (1996), using the historical database, to obtain a new set of synthetic initial condition representative of a sufficient large number of cyclone.…”
Section: Synthetic Tropical Cyclone Database
confidence: 99%
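The idea of expanding a historical database into a larger synthetic set can be sketched as resampling with perturbation. This is a deliberate simplification of the empirical simulation technique of Scheffner et al. (1996), which uses nearest-neighbour interpolation in storm-parameter space; the parameter names and jitter size here are hypothetical:

```python
import random

def synthetic_storms(historical, n, jitter=0.05, seed=0):
    """Draw storms from the historical database with replacement and
    perturb each parameter by a small relative amount, producing new
    plausible realizations beyond the observed record. (A sketch, not
    the operational EST procedure.)"""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        base = rng.choice(historical)
        out.append({k: v * (1.0 + rng.uniform(-jitter, jitter))
                    for k, v in base.items()})
    return out

# Hypothetical historical records: central pressure deficit (hPa),
# radius to maximum winds (km), forward speed (m/s).
hist = [{"dp": 60.0, "rmw": 35.0, "vf": 6.0},
        {"dp": 45.0, "rmw": 50.0, "vf": 4.5}]
synth = synthetic_storms(hist, 100)
```

Each synthetic storm stays close to an observed one, so the expanded set samples the neighbourhood of the historical record rather than inventing storms far outside it.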
“…For example, in the US, cost-benefit calculations by the US Army Corps of Engineers (USACE) for coastal programs and programs within the Federal Emergency Management Agency (FEMA) such as the National Flood Insurance Program (NFIP) consider an expected statistical behavior over a multi-year scale, as do most planning groups worldwide; so this will be the focus here. Prior to Hurricane Katrina in 2005, the flood hazard assessments transitioned from the Joint Probability Method (JPM) in the 1970s [60,25] to a Historical Storm Method (HSM) in the 1990s [69] based on a Points Over Threshold (POT) approach combined with Monte Carlo simulations to quantify potential uncertainty. The use of Monte Carlo to quantify uncertainty provides an unbiased estimate of errors, assuming the initial estimate provides an unbiased estimate of the parent population given all quantiles of the sample match the equivalent quantiles of the parent population.…”
Section: Estimating Tropical Cyclone Surge Hazard on a Multi-year Scale
confidence: 99%
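The Points-Over-Threshold approach combined with Monte Carlo uncertainty that the last snippet describes can be sketched as follows. This is an illustrative bootstrap on a toy surge record, not the operational HSM procedure; the threshold, quantile, and data values are all made up:

```python
import random

def pot_quantile_ci(surges, threshold, q, n_boot=2000, seed=1):
    """Points-Over-Threshold with Monte Carlo uncertainty: keep
    exceedances above the threshold, then bootstrap-resample them to
    estimate the sampling spread of the q-th quantile."""
    rng = random.Random(seed)
    peaks = sorted(s for s in surges if s > threshold)

    def quantile(xs):
        xs = sorted(xs)
        i = min(int(q * len(xs)), len(xs) - 1)
        return xs[i]

    boots = sorted(quantile([rng.choice(peaks) for _ in peaks])
                   for _ in range(n_boot))
    lo = boots[int(0.025 * n_boot)]
    hi = boots[int(0.975 * n_boot)]
    return quantile(peaks), (lo, hi)

# Hypothetical surge peaks in metres.
data = [0.4, 0.7, 1.1, 1.3, 1.6, 2.0, 2.4, 3.1, 0.5, 1.8]
est, (lo, hi) = pot_quantile_ci(data, 1.0, 0.9)
```

The Monte Carlo interval (lo, hi) conveys the sampling uncertainty the snippet refers to: with only a handful of exceedances, the spread around the point estimate is wide, which is exactly why the unbiasedness caveat about the parent population matters.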