Abstract. Detection of long-term, linear trends is affected by a number of factors, including the size of trend to be detected, the time span of available data, and the magnitude of variability and autocorrelation of the noise in the data. The number of years of data necessary to detect a trend is strongly dependent on, and increases with, the magnitude of the variance (σ²_N) and autocorrelation coefficient (φ) of the noise. For a typical range of values of σ²_N and φ, the number of years of data needed to detect a trend of 5%/decade can vary from ~10 to >20 years, implying that in choosing sites to detect trends some locations are likely to be more efficient and cost-effective than others. Additionally, some environmental variables allow for an earlier detection of trends than other variables because of their low variability and autocorrelation. The detection of trends can be confounded when sudden changes occur in the data, such as when an instrument is changed or a volcano erupts. Sudden level shifts in data sets, whether due to artificial sources, such as changes in instrumentation or site location, or natural sources, such as volcanic eruptions or local changes to the environment, can strongly impact the number of years necessary to detect a given trend, increasing the number of years by as much as 50% or more. This paper provides formulae for estimating the number of years necessary to detect trends, along with estimates of the impact of interventions on trend detection. The uncertainty associated with these estimates is also explored. The results presented are relevant for a variety of practical decisions in managing a monitoring station, such as whether to move an instrument, change monitoring protocols in the middle of a long-term monitoring program, or try to reduce uncertainty in the measurements by improved calibration techniques.
The results are also useful for establishing reasonable expectations for trend detection and can be helpful in selecting sites and environmental variables for the detection of trends. An important implication of these results is that it will take several decades of high-quality data to detect the trends likely to occur in nature.

Introduction

The impact of human intervention in a changing environment has brought about increased concern for detecting trends in various types of environmental data.
This paper is concerned with temporal data requirements for the assessment of trends and for estimating spatial correlations of atmospheric species. We examine three basic statistical issues: (1) the effect of autocorrelation in monthly observations, and of the length of the data record, on the precision of trend estimates; (2) the effect of autocorrelation in the daily data on the sampling frequency required for monthly averages to be representative for trend estimation; and (3) the effect of temporal sampling schemes on estimating spatial correlations.
A surface radiation budget observing network (SURFRAD) has been established for the United States to support satellite retrieval validation, modeling, and climate, hydrology, and weather research. The primary measurements are the downwelling and upwelling components of broadband solar and thermal infrared irradiance. A hallmark of the network is the measurement and computation of ancillary parameters important to the transmission of radiation. SURFRAD commenced operation in 1995. Presently, it is made up of six stations in diverse climates, including the moist subtropical environment of the U.S. southeast, the cool and dry northern plains, and the hot and arid desert southwest. Network operation involves a rigorous regimen of frequent calibration, quality assurance, and data quality control. An efficient supporting infrastructure has been created to gather, check, and disseminate the basic data expeditiously. Quality-controlled daily processed data files from each station are usually available via the Internet within a day of real time. Data from SURFRAD have been used to validate measurements from NASA's Earth Observing System series of satellites, satellite-based retrievals of surface erythemal radiation, the national ultraviolet index, and real-time National Environmental Satellite, Data, and Information Service (NESDIS) products. They have also been used for carbon sequestration studies, to check radiative transfer codes in various physical models, for basic research and instruction at universities, for climate research, and for many other applications. Two stations now have atmospheric energy flux and soil heat flux instrumentation, making them full surface energy balance sites. It is hoped that eventually all SURFRAD stations will have this capability.

1. Introduction

The National Oceanic and Atmospheric Administration's (NOAA's) Surface Radiation budget network (SURFRAD) is the first of its kind to operate across the United States.
The network began in 1995 with four stations and expanded to six in 1998 (Fig. 1). Its mission is to provide the climate research, weather forecasting, satellite, and educational communities with continuous, accurate, high quality surface radiation budget measurements for different climates of the United States. Quality assurance in the station design,
Abstract. International agreements for the limitation of ozone-depleting substances have already resulted in decreases in concentrations of some of these chemicals in the troposphere. Full compliance and understanding of all factors contributing to ozone depletion are still uncertain; however, reasonable expectations are for a gradual recovery of the ozone layer over the next 50 years. Because of the complexity of the processes involved in ozone depletion, it is crucial to detect not just a decrease in ozone-depleting substances but also a recovery in the ozone layer. The recovery is likely to be detected in some areas sooner than others because of natural variability in ozone concentrations. On the basis of both the magnitude and autocorrelation of the noise from Nimbus 7 Total Ozone Mapping Spectrometer ozone measurements, estimates of the time required to detect a fixed trend in ozone at various locations around the world are presented. Predictions from the Goddard Space Flight Center (GSFC) two-dimensional chemical model are used to estimate the time required to detect predicted trends in different areas of the world. The analysis is based on our current understanding of ozone chemistry, full compliance with the Montreal Protocol and its amendments, and no intervening factors, such as major volcanic eruptions or enhanced stratospheric cooling. The results indicate that recovery of total column ozone is likely to be detected earliest in the Southern Hemisphere near New Zealand, southern Africa, and southern South America and that the range of time expected to detect recovery for most regions of the world is between 15 and 45 years. Should the recovery be slower than predicted by the GSFC model, owing, for instance, to the effect of greenhouse gas emissions, or should measurement sites be perturbed, even longer times would be needed for detection.
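The abstracts above refer to formulae for estimating the number of years of monthly data needed to detect a linear trend with high (roughly 90%) probability, given the standard deviation σ_N of the noise and its lag-1 autocorrelation φ. A minimal sketch of that widely cited approximation follows; the function name and the example values are illustrative assumptions, not taken from the papers:

```python
import math

def years_to_detect(trend_per_year, sigma_n, phi):
    """Approximate number of years of monthly data needed to detect a
    linear trend with ~90% probability, assuming AR(1) noise.

    trend_per_year : magnitude of the trend (data units per year)
    sigma_n        : standard deviation of the monthly noise
    phi            : lag-1 autocorrelation of the monthly noise (-1 < phi < 1)
    """
    if not -1.0 < phi < 1.0:
        raise ValueError("phi must lie strictly between -1 and 1")
    # Autocorrelated noise inflates the effective noise level by
    # sqrt((1 + phi) / (1 - phi)); the 2/3 exponent converts the
    # signal-to-noise requirement into a record length in years.
    factor = (3.3 * sigma_n / abs(trend_per_year)) * math.sqrt((1 + phi) / (1 - phi))
    return factor ** (2.0 / 3.0)

# Illustrative case: a 5%/decade trend (0.5%/yr), monthly noise with a
# standard deviation of 5% of the mean, and moderate autocorrelation.
print(round(years_to_detect(0.5, 5.0, 0.3), 1))  # roughly 12.6 years
```

Note how the result sits inside the ~10 to >20 year range quoted in the first abstract: increasing either the noise level or the autocorrelation lengthens the required record, which is why site selection matters for efficient trend detection.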