SUMMARY
Moment tensor inversion of early twentieth-century mechanical seismograph recordings offers the opportunity to estimate source parameters for a number of important earthquakes. This knowledge is crucial for understanding regional tectonics and estimating seismic hazard, but processing of the historical seismograms is often difficult. In particular, the rotation of horizontal seismograms into radial (P-SV) and transverse (SH) components tends to fail, for example if one polarity is wrong, if the alignment of the seismograms is incorrect, or if one component is lost. To avoid these kinds of problems, we use a moment-tensor-inversion scheme that processes the original single-component horizontal traces without rotation and rotates the theoretical Green's functions instead. We show how this approach simplifies the analysis and allows better use of the available data. We present moment tensor solutions for two destructive earthquakes in southwestern Europe in 1909. The April 23 earthquake near Benavente in the lower Tagus valley (Portugal) had a moment magnitude Mw = 6.0 and an estimated centroid depth of 10 km. Our preferred moment tensor solution indicates reverse faulting (nodal planes have strike/dip/rake of N51°E/52°/83° and N242°E/38°/99°). We propose a blind thrust beneath the Tagus valley sediment basin as the responsible fault. The June 11 earthquake near Lambesc, Provence (France) was found to be slightly smaller (Mw = 5.5) and related to oblique reverse faulting at 4 km depth. Nodal planes of the preferred solution have strike/dip/rake of N80°E/53°/53° and N311°E/50°/128°, subparallel to the Trévaresse fold.
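The component rotation the abstract describes as error-prone can be written in a few lines. Below is a minimal illustrative sketch, not taken from the paper itself, using one common seismological sign convention (radial positive away from the source along the back azimuth; conventions differ between processing packages):

```python
import numpy as np

def rotate_ne_to_rt(north, east, back_azimuth_deg):
    """Rotate north/east horizontal traces into radial/transverse.

    Sign convention is an assumption here: with back azimuth beta
    (station-to-source direction, degrees clockwise from north),
        R = -N cos(beta) - E sin(beta)
        T =  N sin(beta) - E cos(beta)
    A wrong polarity or mis-aligned component corrupts both outputs,
    which is the failure mode the abstract's scheme avoids by
    rotating the Green's functions instead of the data.
    """
    beta = np.deg2rad(back_azimuth_deg)
    radial = -np.cos(beta) * north - np.sin(beta) * east
    transverse = np.sin(beta) * north - np.cos(beta) * east
    return radial, transverse
```

For a source due north of the station (back azimuth 0°), the radial component is simply the negated north trace and the transverse is the negated east trace, which is a quick sanity check for the convention used.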
For an optimal analysis of the H/V curve, it is necessary to check the instrument's signal-to-noise ratio in the studied frequency band, to ensure that the signal from the ground noise is well above the internal noise. We assess the reliability and accuracy of various digitizers, sensors, and digitizer-sensor pairs. Although this study is of general interest for any kind of seismological work, we emphasize the influence of equipment on H/V analysis results. To show the impact of the instrumentation on H/V behavior, a series of tests was carried out in a step-by-step procedure: first the digitizers were tested in the lab (sensitivity, internal noise, etc.), then the three-component sensors, still in the lab, and finally the usual digitizer-sensor pairs both in the lab and outdoors. In general, the digitizer characteristics verified during these tests correspond well to the manufacturer specifications; however, depending on the digitizer, the quality of the digitized waveform can range from very good to very poor, with variation from one channel to another (gain, time differences, etc.). It is very clear that digitizers need a warm-up time before recording to avoid problems in the low-frequency range. Regarding sensors, we strongly recommend avoiding "classical" accelerometers (i.e., the usual force-balance technology). The majority of tested seismometers (broadband and short-period, even 4.5 Hz) can be used without problems from 0.4 to 25 Hz. In all cases, the instrumentation should be checked first to verify that it works well for the defined study aim, but also to define its limits of use (frequency, sensitivity, etc.).
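The H/V curve discussed above is, at its core, a ratio of horizontal to vertical spectral amplitudes of ambient noise. The following is a minimal illustrative sketch, not the authors' processing chain; real H/V processing adds time windowing, spectral smoothing (e.g. Konno-Ohmachi), and averaging over many windows:

```python
import numpy as np

def hv_ratio(north, east, vertical, sampling_rate):
    """Crude single-window H/V spectral ratio from three traces.

    Horizontals are combined as a quadratic mean of the two
    amplitude spectra; the vertical spectrum is floored to avoid
    division by zero. Returns (frequencies, H/V curve).
    """
    freqs = np.fft.rfftfreq(len(vertical), d=1.0 / sampling_rate)
    amp = lambda x: np.abs(np.fft.rfft(x))
    h = np.sqrt((amp(north) ** 2 + amp(east) ** 2) / 2.0)
    v = np.maximum(amp(vertical), 1e-20)
    return freqs, h / v
```

Because the curve is a ratio of recorded spectra, any channel-dependent gain error or excess instrumental noise of the kind the tests above uncover maps directly into a distorted H/V curve, which is why the instrument checks matter.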
A series of investigations has been carried out over the last decade in Europe aimed at deriving quantitative information on site amplification from non-invasive techniques, based principally on surface-wave interpretations of ambient-noise measurements. The present paper focuses on their key outcomes regarding three main topics. First, methodological, hardware, and software developments for the acquisition and processing of both single-point and array microtremor measurements have led to an efficient tool with in situ control and processing, giving rise to robust and reproducible results. Special attention has been devoted to the derivation and use of the Rayleigh-wave ellipticity. Second, the reliability of these new tools has been assessed through a thorough comparison with borehole measurements for a representative, though limited, set of sites located in southern Europe, spanning from stiff to soft and from shallow to thick. Finally, correlations between the site parameters available from such non-invasive techniques and the actual site amplification factors measured with standard techniques are derived from a comprehensive analysis of the Japanese KiK-net data. This allows us to propose an alternative, simple site characterization that provides improved variance reduction compared with the "classical" VS30 classification. While these results could pave the way for the next generation of building codes, they can also be used now for regulatory site classification and microzonation studies, in view of improved mapping and estimation of site amplification factors, and for the characterization of existing strong-motion sites.
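The VS30 parameter used as the "classical" baseline above is the time-averaged shear-wave velocity over the top 30 m of the soil column, VS30 = 30 / Σ(h_i / Vs_i). A small illustrative helper, under the assumption that the supplied layer model already totals exactly 30 m:

```python
def vs30(thicknesses_m, velocities_m_s):
    """Time-averaged shear-wave velocity over the top 30 m.

    thicknesses_m: layer thicknesses in metres (assumed to sum to 30)
    velocities_m_s: shear-wave velocity of each layer in m/s

    The average is harmonic (total depth over total travel time),
    not arithmetic, so slow near-surface layers dominate the result.
    """
    travel_time = sum(h / v for h, v in zip(thicknesses_m, velocities_m_s))
    return 30.0 / travel_time
```

For example, a 10 m layer at 200 m/s over a 20 m layer at 400 m/s gives equal travel times in each layer (0.05 s), so VS30 is 300 m/s, not the arithmetic mean weighted by thickness.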