In this paper we present a penalized likelihood-based method for spatial estimation of the Gutenberg-Richter b value. Our method incorporates a nonarbitrary partitioning scheme based on Voronoi tessellation, which allows for the optimal partitioning of space using a minimum number of free parameters. By randomly placing an increasing number of Voronoi nodes, we are able to explore the whole solution space in terms of model complexity. We obtain an overall likelihood for each model by estimating the b values in all Voronoi regions and calculating the joint likelihood using Aki's formula. Accounting for the number of free parameters, we then calculate the Bayesian Information Criterion for all random realizations. We investigate the ensemble of the best-performing models and demonstrate the robustness and validity of our method through extensive synthetic tests. We apply our method to the seismicity of California using two different time spans of the Advanced National Seismic System catalog (1984-2014 and 2004-2014). The results show that for the last decade, the b value variation in the well-instrumented parts of mainland California is limited to the range 0.94 ± 0.04 to 1.15 ± 0.06. Apart from the Geysers region, the observed variation can be explained by network-related discrepancies in the magnitude estimations. Our results suggest that previously reported spatial b value variations obtained using classical fixed-radius or nearest-neighbor methods are likely to have been overestimated, mainly due to subjective parameter choices. We envision that the likelihood-based model selection criteria used in this study can be a useful tool for generating improved earthquake forecasting models.
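The per-region computation described above (Aki's maximum-likelihood b value in each Voronoi cell, a joint log-likelihood, and a BIC penalty counting one free parameter per cell) can be sketched as follows. This is a minimal illustration, not the authors' code: the function names, the Utsu binning correction, and the assumption of a single completeness magnitude Mc shared by all regions are ours.

```python
import numpy as np

def aki_b_value(mags, mc, dm=0.1):
    """Aki (1965) maximum-likelihood b value for magnitudes >= mc.

    The dm/2 term is Utsu's correction for magnitude binning width dm.
    """
    m = np.asarray(mags, dtype=float)
    m = m[m >= mc]
    b = np.log10(np.e) / (m.mean() - (mc - dm / 2.0))
    return b, m.size

def gr_log_likelihood(mags, mc, b, dm=0.1):
    """Log-likelihood of the exponential Gutenberg-Richter model."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= mc]
    beta = b * np.log(10.0)
    return np.sum(np.log(beta) - beta * (m - (mc - dm / 2.0)))

def tessellation_bic(mags_per_region, mc, dm=0.1):
    """BIC of a candidate tessellation: one free b value per region."""
    total_ll, n_total, k = 0.0, 0, 0
    for mags in mags_per_region:
        b, n = aki_b_value(mags, mc, dm)
        total_ll += gr_log_likelihood(mags, mc, b, dm)
        n_total += n
        k += 1  # one free parameter (b) per Voronoi region
    return k * np.log(n_total) - 2.0 * total_ll
```

Candidate tessellations with more regions gain likelihood but pay a larger BIC penalty, which is what lets the random-node ensemble trade off fit against complexity.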
Abstract: We present a novel method to estimate the multifractal spectrum of point distributions. The method incorporates two motivated criteria (barycentric pivot point selection and non-overlapping coverage) in order to reduce edge effects, improve precision, and reduce computation time. Implementation of the method on synthetic benchmarks demonstrates the superior performance of the proposed method compared with existing alternatives routinely used in the literature. Finally, we use the method to estimate the multifractal properties of the widely studied growth process of Diffusion Limited Aggregation (DLA) and compare our results with recent and earlier studies. Our tests support the conclusion of a genuine but weak multifractality of the central core of DLA clusters, with D_q decreasing from 1.75 ± 0.01 for q = -10 to 1.65 ± 0.01 for q = +10.

1 - Introduction

Since their popularization by Mandelbrot [1], fractals and fractal geometry have been empirically observed and extensively studied in a wealth of natural and experimental physical phenomena. A common way to quantify the fractal or multifractal properties of a given set of data points is to calculate its generalized (Renyi) dimensions [2], given as:

D_q = \lim_{\varepsilon \to 0} \frac{1}{q-1} \frac{\ln \sum_i p_i(\varepsilon)^q}{\ln \varepsilon}

where ε is the scale of observation, p_i(ε) is the fraction of data points (i.e., the estimated measure) within box i of size ε, q is a real-valued moment order, and the sum is performed over all boxes covering the data set under investigation.
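A naive box-counting estimate of D_q along the lines of the formula above can be sketched as follows; it regresses over a finite range of ε rather than taking the limit, and it lacks exactly the refinements (barycentric pivot points, non-overlapping coverage) that the paper proposes. The function name and parameter choices are illustrative.

```python
import numpy as np

def renyi_dimensions(points, q, epsilons):
    """Naive box-counting estimate of the generalized dimension D_q.

    points: (N, d) array of coordinates; q: moment order (q != 1);
    epsilons: sequence of box sizes over which to regress.
    Returns the slope of log(sum_i p_i^q)/(q - 1) versus log(eps).
    """
    pts = np.asarray(points, dtype=float)
    pts = pts - pts.min(axis=0)  # shift the set into the positive quadrant
    ys = []
    for eps in epsilons:
        # assign every point to a grid box and count box occupancies
        idx = np.floor(pts / eps).astype(np.int64)
        _, counts = np.unique(idx, axis=0, return_counts=True)
        p = counts / counts.sum()
        ys.append(np.log(np.sum(p ** q)) / (q - 1.0))
    slope, _ = np.polyfit(np.log(np.asarray(epsilons)), ys, 1)
    return slope
```

For a uniform point cloud in the unit square this recovers D_q ≈ 2; for sparse or strongly clustered sets, the edge effects and fixed grid origin of this naive version are precisely what degrade precision and motivate the paper's criteria.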
Tormann et al. [2014] propose a distance exponential weighted (DEW) b value mapping approach as an improvement to previous constant-radius and nearest-neighborhood methods. To test the performance of their proposed method the authors introduce a score function:

S = \frac{1}{n} \sum_{i=1}^{N} \left| b_{\mathrm{true}}^{(i)} - b_{\mathrm{est}}^{(i)} \right|    (1)

where N is the total number of grid nodes, n is the number of nonempty nodes (empty nodes contribute nothing to the sum), and b_true and b_est are the true and estimated b values, respectively. This score function is applied to a semisynthetic earthquake catalog to make inferences about the parameters of the method. In this comment we argue that the proposed methodology cannot be applied to seismic analysis, since it requires a priori knowledge of the spatial b value distribution, which is the very quantity it aims to reveal.

The score function given in equation (1) seeks to minimize the absolute difference between the generating (true) and estimated b values. Like any statistical parameter, the estimation error of the b value decreases with increasing sample size. However, since the b value is a measure of slope on a log-log plot, for any given sample size the estimation error is not only asymmetric but its amplitude also depends on the b value itself.

To make a pedagogical analogy: conducting b value analysis with a limited sample size (Tormann et al. use 150) is similar to measuring temperature with a peculiarly short thermometer. For small sample sizes, such a thermometer would measure lower temperatures more precisely than higher ones, to the extent that it would be impossible to distinguish between values on the upper end. Only when the sample size, and hence the resolution, is increased does the thermometer become reliable across the whole measurement range. As an illustration, in Figure 1a we show the confidence intervals for 6 b values; accordingly, we divide our thermometer into 6 intervals. The length of each interval is scaled with 1/σ_N as a proxy for resolution, where σ_N is the standard deviation for N samples.
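The asymmetry and b dependence of the estimation error can be checked by direct Monte Carlo simulation of Aki's estimator. The sketch below assumes a complete catalog (magnitudes exponentially distributed above the completeness threshold) and is our own illustration, not the comment's code.

```python
import numpy as np

def b_value_ci(b_true, n, n_sims=20000, conf=0.95, seed=0):
    """Monte Carlo confidence interval of Aki's b value estimator.

    Draws n_sims synthetic catalogs of n magnitudes from a GR law with
    the given b value and returns the central conf-level quantiles of
    the estimated b. Assumes a complete catalog above the threshold.
    """
    rng = np.random.default_rng(seed)
    beta = b_true * np.log(10.0)
    # magnitudes above completeness are exponential with rate beta;
    # Aki's estimator only needs the sample mean of (m - mc)
    means = rng.exponential(scale=1.0 / beta, size=(n_sims, n)).mean(axis=1)
    b_est = np.log10(np.e) / means
    lo, hi = np.quantile(b_est, [(1 - conf) / 2, (1 + conf) / 2])
    return lo, hi
```

Running this for n = 150 reproduces the thermometer picture: the interval for a high b value is both wider and more right-skewed than the interval for a low one.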
The analogous thermometers obtained for N = 150, 300, and 1500 are shown in Figure 1b. Intersecting the confidence interval curves of different b values, one can derive that the crudest measurement in the range [0.6-1.5] would require a resolution of Δb = 0.3, which is achieved with at least ~170 samples. An analysis in this constrained setting can correspond only to a mere classification of b values as low (0.70 ± 0.10), medium (1.00 ± 0.13), or high (1.30 ± 0.17). Such a graphical inference is, however, an oversimplification, because determining whether two Gutenberg-Richter (GR) distributions are significantly different (for a given p value) would require nested-hypothesis tests (see supporting information).

Compared to the previous sampling techniques, the proposed DEW method features two additional parameters: the maximum number of events (N_max) and the coefficient of exponential magnitude-weight decay with distance (λ). The remaining parameters, common with the previous approaches, are set to the following values: minimum number of events (N_min = 50), maximum search radius (R_max = 7.5), and m...
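As a rough plausibility check of the sample-size figure above, one can use Aki's large-sample approximation σ ≈ b/√N and ask when the ±kσ intervals of the two highest b values in the range stop overlapping. The function and the non-overlap criterion below are our own simplification, not the nested-hypothesis test the comment advocates.

```python
import numpy as np

def min_samples(b_low, b_high, delta_b, k=1.0):
    """Smallest N at which two b values delta_b apart remain separable.

    Uses Aki's approximation sigma = b / sqrt(N) and requires the
    +/- k*sigma intervals of the two highest b values in [b_low, b_high]
    not to overlap. A rough criterion, not a hypothesis test.
    """
    b2 = b_high
    b1 = b_high - delta_b
    # non-overlap condition: k * (b1 + b2) / sqrt(N) < delta_b
    n = (k * (b1 + b2) / delta_b) ** 2
    return int(np.ceil(round(n, 6)))  # round() guards against float noise
```

With k = 1 and Δb = 0.3 over [0.6, 1.5] this gives on the order of 80 samples; stricter confidence multipliers (k near 1.5) push the requirement toward the ~170 quoted above, so the exact threshold depends on the confidence level chosen.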