Core Ideas
A new US soil moisture network provides an opportunity to evaluate drought
Soil moisture at deeper depths did not fully recover in 2013 from the 2012 drought
The soil moisture response to drought varied by region
The NOAA United States Climate Reference Network (USCRN) deployed soil moisture sensors from 2009 to 2011 to monitor the temporal and spatial variability of soil moisture at 114 locations in the contiguous United States. These new observations will improve our understanding of changing soil conditions and support better drought monitoring. One year after full deployment of the network, a large drought occurred across most of the United States, providing an opportunity to evaluate the utility of the network for drought monitoring. The soil moisture signal of the 2012 drought in the continental United States was detected nationally at all observational depths (5, 10, 20, 50, and 100 cm), with an overall 11.07% decrease relative to the average of the 2011 to 2013 summers. The top three depths (5, 10, and 20 cm) experienced the largest decreases in soil moisture. Although national precipitation totals returned to normal in 2013 and national soil moisture levels recovered from the 2012 drought, the national average soil moisture at the 50- and 100-cm depths combined remained about 18% below pre-drought levels. Regional analysis of the 2012 drought identified the Upper Midwest, Northeast, Northern Rockies and Plains, and Ohio Valley climate regions as the most impacted, and these regions showed a temporal pattern similar to the national analysis. These results demonstrate the utility of the USCRN for monitoring national soil moisture conditions, assessing droughts, and tracking climate change over time.
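As a rough illustration of the kind of anomaly calculation summarized above, the sketch below computes summer (June–August) mean soil moisture by depth and year and expresses the 2012 summer as a percent departure from the 2011–2013 summer average. It is a minimal sketch: the DataFrame layout, column names, and function name are assumptions for illustration, not the network's actual processing code.

```python
import pandas as pd

def summer_anomaly_2012(obs: pd.DataFrame) -> pd.Series:
    """Percent departure of 2012 summer soil moisture from the 2011-2013 summer mean.

    obs: hypothetical long-format table with columns
         'date' (datetime), 'depth_cm' (int), 'vwc' (volumetric water content).
    Returns one percent-departure value per observation depth.
    """
    # Keep only June-August observations and label each row with its year.
    summer = obs[obs["date"].dt.month.isin([6, 7, 8])].copy()
    summer["year"] = summer["date"].dt.year

    # Mean summer soil moisture per depth and year (depths as rows, years as columns).
    yearly = summer.groupby(["depth_cm", "year"])["vwc"].mean().unstack("year")

    # 2012 departure from the 2011-2013 summer average, in percent.
    baseline = yearly[[2011, 2012, 2013]].mean(axis=1)
    return 100.0 * (yearly[2012] - baseline) / baseline
```

Negative values at a given depth would correspond to the drought-related deficits described above, for example the larger decreases found at the 5-, 10-, and 20-cm depths.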
The main challenge in evaluating droughts in the context of climate change, and in linking these droughts to adverse societal outcomes, is the lack of a uniform definition that identifies drought conditions at a given location and time. The U.S. Drought Monitor (USDM), created in 1999, is a well-established composite index that combines drought indicators from across the hydrological cycle (i.e., meteorological to hydrological) with information from local experts, making it one of the most holistic measures for evaluating past drought conditions across the United States. In this study, the USDM was used to define drought events as consecutive periods during which the USDM status met or exceeded D1 (moderate drought) conditions over the past 20 years. This analysis was applied to 5-km grid cells covering the United States and Puerto Rico to characterize the frequency, duration, and intensification rates of drought, as well as the timing of onset, amelioration, and other measures for every drought event on record. The results revealed stark contrasts in the evolution of drought across the United States. Over the western United States, droughts evolved much more slowly, resulting in fewer but longer-lasting events, whereas the eastern United States experienced more frequent, shorter-duration events. Given the slower evolution from onset to drought peak, flash droughts, which made up 9.8% of all droughts, were less common across the western United States and more frequent over the southern United States. The most severe event on record was the 2012 drought, during which more than 21% of the United States experienced its largest number of weeks at or above extreme (D3) drought conditions. This record of historical drought events can support future studies relating drought to adverse societal outcomes and can aid in evaluating mitigation strategies by giving local decision makers a dataset for comparing and evaluating past droughts.
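To make that event definition concrete, here is a minimal sketch of one way to extract drought events from a single grid cell's weekly USDM record; the integer encoding of categories, the function name, and the returned statistics are illustrative assumptions rather than the study's actual implementation.

```python
from typing import List, Tuple

def drought_events(weekly_usdm: List[int], threshold: int = 1) -> List[Tuple[int, int, int]]:
    """Identify drought events in one grid cell's weekly USDM series.

    weekly_usdm: USDM category per week, encoded here as
                 -1 = no drought designation, 0 = D0, 1 = D1, ..., 4 = D4.
    An event is a run of consecutive weeks at or above the D1 threshold.
    Returns (onset_week_index, duration_in_weeks, peak_category) per event.
    """
    events = []
    start = None
    for week, cat in enumerate(weekly_usdm):
        if cat >= threshold and start is None:
            start = week                                  # drought onset
        elif cat < threshold and start is not None:
            run = weekly_usdm[start:week]
            events.append((start, len(run), max(run)))    # onset, duration, peak
            start = None
    if start is not None:                                 # record ends while still in drought
        run = weekly_usdm[start:]
        events.append((start, len(run), max(run)))
    return events
```

For example, drought_events([-1, 0, 1, 2, 3, 3, 1, 0, -1]) returns [(2, 5, 3)]: one event beginning in week 2, lasting 5 weeks, and peaking at D3. Event-level statistics like frequency, duration, and intensification rate can then be aggregated per grid cell.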
The air-freezing index (AFI) is a common metric for characterizing the freezing severity of the winter season and estimating frost depth in midlatitude regions, which is useful for determining the depth of shallow foundation construction. AFI values represent the seasonal magnitude and duration of below-freezing air temperature. Departures of the daily mean temperature above or below 0°C (32°F) are accumulated over each August–July cold season; the seasonal AFI value is defined as the difference between the highest and lowest points of the accumulated curve. Return periods are computed using generalized extreme value distribution analysis. This research replaces the methodology used by the National Oceanic and Atmospheric Administration to calculate AFI return periods for the 1951–80 period and applies the new methodology to the 1981–2010 climate normals period. Seasonal AFI values and return period values were calculated for 5600 stations across the coterminous United States (CONUS), and the results were validated using U.S. Climate Reference Network temperature data. Return period values are typically 14%–18% lower across CONUS during 1981–2010 versus a recomputation of 1951–80 return periods with the new methodology. For the 100-yr (2-yr) return periods, about 59% (83%) of stations show a decrease of more than 10% in the more recent period, whereas 21% (2%) show an increase of more than 10%, indicating a net reduction in winter severity that is consistent with observed climate change.
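As a rough illustration of the accumulation described above, the sketch below computes a seasonal AFI from daily mean temperatures: departures from 0°C (which, in Celsius, are simply the temperatures themselves) are summed cumulatively through an August–July season, and the AFI is the range of that cumulative curve. The function name and input layout are assumptions, not the study's code.

```python
import numpy as np

def seasonal_afi(daily_mean_temp_c: np.ndarray) -> float:
    """Seasonal air-freezing index for one August-July cold season.

    daily_mean_temp_c: daily mean temperatures in deg C, ordered from
    1 August through 31 July, with missing days already handled.
    """
    # Daily departures from the freezing point: positive above 0 C, negative below.
    departures = np.asarray(daily_mean_temp_c, dtype=float)
    # Running accumulation of departures through the season.
    cumulative = np.cumsum(departures)
    # AFI is the difference between the highest and lowest points of the curve,
    # i.e., the net below-freezing accumulation between the autumn peak and spring trough.
    return float(cumulative.max() - cumulative.min())
```

Return periods for a station could then be estimated by fitting a generalized extreme value distribution to its series of seasonal AFI values (for example with scipy.stats.genextreme), although the study's exact fitting and quality-control steps are not reproduced here.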