Robust estimation of power spectra, coherences, and transfer functions is investigated in the context of geophysical data processing. The methods described are frequency-domain extensions of current techniques from the statistical literature and are applicable in cases where section-averaging methods would be used with data contaminated by local nonstationarity or isolated outliers. The paper begins with a review of robust estimation theory, emphasizing statistical principles and the maximum likelihood or M-estimators. These are combined with section-averaging spectral techniques to obtain robust estimates of power spectra, coherences, and transfer functions in an automatic, data-adaptive fashion. Because robust methods implicitly identify abnormal data, methods for monitoring the statistical behavior of the estimation process using quantile-quantile plots are also discussed. The results are illustrated using a variety of examples from electromagnetic geophysics.

INTRODUCTION

Reliable estimation of power spectra for single data sequences, or of transfer functions and coherences between multiple time series, is of central importance in many areas of geophysics and engineering. While the effects of the underlying Gaussian distributional assumptions on such estimates are generally understood, the ability of a small fraction of non-Gaussian noise or localized nonstationarity to affect them is not. These phenomena can destroy conventional estimates, often in a manner that is difficult to detect.

Problems with conventional (i.e., nonrobust) time series procedures arise because they are essentially copies of classical statistical procedures parameterized by frequency. Once Fourier transforms are taken, estimating a spectrum is the same process as computing a variance, and estimating a transfer function is a similar procedure to linear regression.
Because these methods are based on the least squares or Gaussian maximum likelihood approaches to statistical inference, their advantages include simplicity and the optimality properties established by the Gauss-Markov theorem. If the residuals are drawn from a multivariate normal probability distribution, then the least squares result is also a maximum likelihood, fully efficient, minimum variance estimate. In practice, the regression model is rarely an accurate description due to departures of the data from the model requirements. Most data contain a small fraction of unusual observations or "outliers" that do not fit the model distribution or share the characteristics of the bulk of the sample. These can often be described by a probability distribution which has a nearly Gaussian shape in the center and tails that are heavier than would be expected for a normal one, or by mixtures of Gaussian distributions with different variances.

Two forms of data outliers are common: point defects and local nonstationarity. Point defects are isolated outliers that exist independent of the structure of the process under study. In this paper the principles o...
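The combination of section averaging with M-estimation described above can be sketched as follows. This is an illustrative Huber-weighted average of per-section periodograms, not the paper's exact algorithm: the Hann taper, section length, and tuning constant `k` are assumptions made for the example.

```python
import numpy as np

def robust_section_spectrum(x, nperseg, k=1.5, n_iter=20):
    """Robust section-averaged power spectrum (illustrative sketch).

    Splits x into non-overlapping Hann-tapered sections, forms
    per-section periodograms, and replaces the plain section average
    with an iteratively reweighted Huber M-estimate at each frequency,
    so sections contaminated by outliers or local nonstationarity are
    automatically downweighted.
    """
    nseg = len(x) // nperseg
    segs = x[:nseg * nperseg].reshape(nseg, nperseg)
    win = np.hanning(nperseg)
    # Per-section periodograms: rows are sections, columns frequencies
    P = np.abs(np.fft.rfft(segs * win, axis=1)) ** 2 / (win ** 2).sum()
    est = np.median(P, axis=0)                    # robust starting value
    for _ in range(n_iter):
        r = P - est                               # per-section residuals
        s = 1.4826 * np.median(np.abs(r), axis=0) + 1e-30  # MAD scale
        # Huber weights: unit weight near the estimate, decaying beyond k*s
        w = np.minimum(1.0, k * s / np.maximum(np.abs(r), 1e-30))
        est = (w * P).sum(axis=0) / w.sum(axis=0)  # weighted section average
    return est
```

Unlike a plain section average, a section containing a large transient contributes almost nothing to the final estimate, which is the data-adaptive behavior the abstract describes.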
We have compiled both laboratory and worldwide field data on electrical conductivity to help understand the physical implications of deep crustal electrical profiles. Regional heat flow was used to assign temperatures to each layer in regional electrical conductivity models; we avoided those data where purely conductive heat flow suggested temperatures more than about 1000°C, substantially higher than solidus temperatures and outside the range of validity of heat flow models. The resulting plots of log conductivity σ versus 1/T demonstrate that even low‐conductivity layers (LCL) have conductivities several orders of magnitude higher than dry laboratory samples and that the data can be represented by straight line fits. In addition, tectonically active regions show systematically higher conductivities than do shield areas. Because volatiles are usually lost in laboratory measurements and their absence is a principal difference between laboratory and field conditions, these materials probably account for the relatively higher conductivities of rocks in situ in the crust; free water in amounts of 0.01–0.1% in fracture porosity could explain crustal conductivities. Other possibilities are graphite, hydrated minerals in rare instances, and sulfur in combination with other volatiles. As most of the temperatures are less than 700°C, partial melting seems likely only in regions of highest heat flow where the conductive temperature profiles are inappropriate. Another result is that at a given temperature, crustal high‐conductivity layers (HCL) are more conductive by another order of magnitude and show more scatter than do LCL's. Because the differences between HCL's and LCL's are independent of temperature, we must invoke more than temperature increases as a cause for large conductivity increases; increased fluid concentration in situ seems a probable cause for enhanced conductivities in HCL's.
From the point of view of these observations, it does not matter whether the fluids are in communication with the surface or trapped at lithostatic pressures.
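The straight-line fits of log conductivity versus 1/T described above are the Arrhenius form. A minimal sketch of such a fit follows; the conductivity values and the activation energy of ~0.26 eV are synthetic numbers chosen for illustration, not values from the compilation.

```python
import numpy as np

# Synthetic Arrhenius-type conductivity data (illustrative only):
# sigma = A * exp(-E / (k_B * T)) with an assumed E of 0.2585 eV.
k_B = 8.617e-5                                     # Boltzmann constant, eV/K
T = np.array([500.0, 600.0, 700.0, 800.0, 900.0])  # temperature, K
sigma = 1e-4 * np.exp(-0.2585 / (k_B * T))         # conductivity, S/m

# Straight-line fit of log10(sigma) against 1/T, as in the text
slope, intercept = np.polyfit(1.0 / T, np.log10(sigma), 1)

# Recover the apparent activation energy from the slope:
# log10(sigma) = log10(A) - E / (ln(10) * k_B) * (1/T)
E_apparent = -slope * np.log(10) * k_B             # eV
```

A steeper (more negative) slope on such a plot corresponds to a higher apparent activation energy, which is why the near-parallel offsets between HCL and LCL trends argue for a compositional or fluid-content difference rather than a temperature effect alone.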
The gravity method was the first geophysical technique to be used in oil and gas exploration. Despite being eclipsed by seismology, it has continued to be an important and sometimes crucial constraint in a number of exploration areas. In oil exploration the gravity method is particularly applicable in salt provinces, overthrust and foothills belts, underexplored basins, and targets of interest that underlie high-velocity zones. The gravity method is used frequently in mining applications to map subsurface geology and to directly calculate ore reserves for some massive sulfide orebodies. There is also a modest increase in the use of gravity techniques in specialized investigations for shallow targets. Gravimeters have undergone continuous improvement during the past 25 years, particularly in their ability to function in a dynamic environment. This and the advent of global positioning systems (GPS) have led to a marked improvement in the quality of marine gravity and have transformed airborne gravity from a regional technique to a prospect-level exploration tool that is particularly applicable in remote areas or transition zones that are otherwise inaccessible. Recently, moving-platform gravity gradiometers have become available and promise to play an important role in future exploration. Data reduction, filtering, and visualization, together with low-cost, powerful personal computers and color graphics, have transformed the interpretation of gravity data. The state of the art is illustrated with three case histories: 3D modeling of gravity data to map aquifers in the Albuquerque Basin, the use of marine gravity gradiometry combined with 3D seismic data to map salt keels in the Gulf of Mexico, and the use of airborne gravity gradiometry in exploration for kimberlites in Canada.