The spatial resolution of exposure data has a substantial impact on the accuracy and reliability of seismic risk estimates. While several studies have investigated the influence of the geographical detail of urban exposure data in earthquake loss models, its implications at the regional scale also need to be understood. This study investigates the effects of exposure resolution on the European loss model and on the resulting loss estimates by simulating 630 exposure and site models representing a wide range of assumptions about the geo-resolution of the exposed asset locations and the associated site conditions. Losses are examined in terms of portfolio average annual loss (AAL) and return period losses at national and sub-national levels. The results indicate that neglecting the uncertainty related to asset locations and their associated site conditions within an exposure model can introduce significant bias into the risk results. They also demonstrate that disaggregating exposure to a grid, or weighting/relocating exposure locations and site properties using a density map of the built areas, can improve the accuracy of the estimated losses.
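To make the two loss metrics concrete, the following minimal sketch (not from the study; all data are hypothetical) shows how a portfolio AAL and return-period losses can be derived from an event-based simulation, the quantities compared across the 630 exposure/site model variants.

```python
import numpy as np

rng = np.random.default_rng(42)
n_years = 10_000                       # length of the stochastic catalogue

# Hypothetical event-loss table: year of occurrence and loss per event
event_years = rng.integers(0, n_years, size=50_000)
event_losses = rng.lognormal(mean=10.0, sigma=1.5, size=50_000)

# Average annual loss: total simulated loss divided by catalogue length
aal = event_losses.sum() / n_years

# Return-period losses from the empirical distribution of annual maxima
annual_max = np.zeros(n_years)
np.maximum.at(annual_max, event_years, event_losses)
for rp in (100, 500, 1000):
    # loss exceeded on average once every `rp` years
    print(f"{rp}-year loss: {np.quantile(annual_max, 1 - 1 / rp):,.0f}")
print(f"AAL: {aal:,.0f}")
```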
The substantial reduction of disaster risk and loss of life, a major goal of the Sendai Framework of the United Nations Office for Disaster Risk Reduction (UNISDR), requires a clear understanding of the dynamics of the built environment and how, in the case of natural disasters, they affect the life of communities, represented by local governments and individuals. These dynamics can be best understood and captured by the local communities themselves, following two of the guiding principles formulated by the UNISDR: "empowerment of local authorities and communities" and "engagement from all of society". Both lead societies to a better understanding of efficient risk mitigation measures.

Our Global Dynamic Exposure model and its technical infrastructure build on the involvement of communities in a citizen-science approach. We employ crowd-sourced exposure capturing based on OpenStreetMap (OSM), an ideal foundation with more than 375 million building footprints already (growing daily by ~150,000) and a plethora of information about schools, hospitals, and other critical facilities. We harvest this dataset with our OpenBuildingMap system by processing the information associated with every building in near-real-time. We enrich it in a truly big-data approach by including built-up-area detection from remote sensing with satellite and radar imagery, combined with different sources of road networks as well as various open datasets and aggregated exposure models that provide relevant additional information on buildings and land use.

We fully automatically collect exposure and vulnerability indicators from explicitly provided data (e.g., hospital locations), implicitly provided data (e.g., building shapes and positions), and semantically derived data, that is, interpretation applying expert knowledge. The latter allows for the translation of simple building properties, as captured by OpenStreetMap users or taken from open datasets, into vulnerability and exposure indicators and subsequently into building classifications as defined in the Building Taxonomy 2.0 developed by the Global Earthquake Model (GEM) and in the European Macroseismic Scale (EMS98). A task of this scale does not come without challenges, particularly in matters of data completeness, privacy, and the merging and homogenizing of different datasets; we are therefore investing a large effort in developing strategies to tackle these in a transparent and consistent way. With our open approach, we increase the resolution of existing exposure models minute by minute through data updates and step by step with each added building, as we move from aggregated to building-by-building descriptions of exposure.

We expect the quality of near-real-time estimates of the extent of natural disasters to increase by an order of magnitude based on the data we are collecting. We envision authorities and first responders greatly benefiting from maps pinpointing the greatest trouble spots in disasters and from detailed quantitative estimates of the likely damage and human losses.
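As an illustration of the semantic-derivation step, the sketch below maps a few real OSM tag keys to coarse GEM taxonomy material classes. The mapping rules are simplified assumptions for illustration only, not the actual OpenBuildingMap logic.

```python
def classify_building(tags: dict) -> str:
    """Map raw OSM tags to a coarse GEM taxonomy material class."""
    material = tags.get("building:material", "unknown")
    levels = int(tags.get("building:levels", 1))  # could refine the class

    if material in ("brick", "stone"):
        return "MUR"   # unreinforced masonry (GEM material code)
    if material == "concrete":
        return "CR"    # reinforced concrete
    if material == "wood":
        return "W"     # wood
    return "UNK"       # unknown, to be refined from other indicators

# Example: a three-storey brick building as tagged in OSM
print(classify_building({"building:material": "brick",
                         "building:levels": "3"}))
```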
Despite their much smaller individual contribution to global counts of casualties and damage than their larger counterparts, earthquakes with moment magnitudes Mw in the range 4.0–5.5 may dominate seismic hazard and risk in areas of low overall seismicity, particularly in regions where anthropogenically induced earthquakes predominate. With the risk posed by these earthquakes causing increasing alarm in certain areas of the globe, it is of interest to determine what proportion of the earthquakes in this magnitude range that occur sufficiently close to population or the built environment actually result in damage and/or casualties. For this purpose, a global catalogue of potentially damaging events (earthquakes deemed potentially capable of causing damage or casualties based on a series of pre-defined criteria) has been generated and contrasted against a database of reportedly damaging small-to-medium earthquakes compiled in parallel to this work. This paper discusses the criteria and methodology followed to define such a set of potentially damaging events, from the issues inherent to earthquake catalogue compilation to the definition of criteria establishing how much potential exposure is sufficient to consider each earthquake a threat. The resulting statistics show that, on average, around 2% of all potentially damaging shocks were actually reported as damaging, though the proportion varies significantly over time as a consequence of the impact of accessibility to data on damage and seismicity in general. Inspection of the years believed to be more complete suggests that a value of around 4–5% might be a more realistic figure.
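A toy version of the underlying statistic (with invented data and a deliberately simplified proximity rule; the paper's criteria are more elaborate) might look as follows.

```python
import numpy as np

# Hypothetical catalogue: magnitude, distance to the nearest populated
# place (km), and whether damage was actually reported.
mags = np.array([4.2, 4.8, 5.1, 5.4, 4.0, 5.0])
dist_km = np.array([3.0, 45.0, 8.0, 120.0, 6.0, 15.0])
damaging = np.array([False, False, True, False, False, False])

# Simplified "potentially damaging" rule (assumption for illustration):
# Mw 4.0-5.5 with exposure inside a magnitude-dependent radius.
in_range = (mags >= 4.0) & (mags <= 5.5)
radius = 5.0 * (mags - 3.0)
potentially = in_range & (dist_km <= radius)

fraction = damaging[potentially].mean()
print(f"{potentially.sum()} potentially damaging events, "
      f"{100 * fraction:.0f}% reported as damaging")
```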
For the seismic design of a structure, horizontal ground shaking is usually considered in two perpendicular directions, even though real horizontal ground motions are complex two-dimensional phenomena that impose different demands at different orientations. While the issue of ground motion dependence on the orientation of the recording devices has been the focus of many significant developments during the last decade, the effects of directionality on the characteristics of the structure have received less attention. This work presents a proposal to calculate the probability of exceedance of elastic spectral displacements accounting for structural typology and illustrates its relevance by means of its application to two case-study buildings. In order to ease its implementation in seismic design codes, a simplification is developed by means of a detailed statistical analysis of the results obtained using four sets of real hazard curves. The framework presented herein is considered to represent an important contribution to the field of performance-based earthquake engineering, permitting improved treatment of directionality effects within seismic risk design and assessment.
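As a hedged sketch of the orientation dependence discussed here (synthetic response histories, not the paper's procedure), the peak elastic displacement of an oscillator can be evaluated across all horizontal orientations to obtain RotD-style percentiles.

```python
import numpy as np

t = np.linspace(0.0, 20.0, 2000)
# Hypothetical orthogonal displacement response histories (metres)
d1 = 0.05 * np.sin(2 * np.pi * 0.8 * t) * np.exp(-0.1 * t)
d2 = 0.03 * np.cos(2 * np.pi * 0.8 * t + 0.6) * np.exp(-0.1 * t)

# Rotate the response through all non-redundant orientations and take
# the peak absolute displacement at each angle
angles = np.radians(np.arange(180))
peaks = np.array([np.abs(d1 * np.cos(a) + d2 * np.sin(a)).max()
                  for a in angles])

print(f"RotD50  = {np.percentile(peaks, 50):.4f} m")   # median over angles
print(f"RotD100 = {peaks.max():.4f} m")                # maximum over angles
```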
Interest in small-to-medium magnitude earthquakes and their potential consequences has increased significantly in recent years, mostly due to the occurrence of some unusually damaging small events, the development of seismic risk assessment methodologies for existing building stock, and the recognition of the potential risk of induced seismicity. As part of a clear ongoing effort of the earthquake engineering community to develop knowledge on the risk posed by smaller events, a global database of earthquakes with moment magnitudes in the range from 4.0 to 5.5 for which damage and/or casualties have been reported has been compiled and is made publicly available. The two main purposes were to facilitate studies on the potential for earthquakes in this magnitude range to cause material damage and to carry out a statistical study to characterise the frequency with which earthquakes of this size cause damage and/or casualties (published separately). The present paper describes the data sources and process followed for the compilation of the database, while providing critical discussions on the challenges encountered and decisions made, which are of relevance for its interpretation and use. The geographic, temporal, and magnitude distributions of the 1958 earthquakes that make up the database are presented alongside the general statistics on damage and casualties, noting that these stem from a variety of sources of differing reliability. Despite its inherent limitations, we believe it is an important contribution to the understanding of the extent of the consequences that may arise from earthquakes in the magnitude range of study.