2004
DOI: 10.1080/10889860490887464
Better Optimization of Long-Term Monitoring Networks

Abstract: Optimization of long-term monitoring (LTM) networks has shifted from a focus on adding information and data to better characterize groundwater plumes toward identifying and removing statistical redundancy so as to minimize long-term costs. An optimal system is defined as one with a minor loss of information but a large gain in cost savings. Better optimization strategies offer improvements in (1) measuring both costs and the accuracy of baseline estimates, (2) choosing optimal subsets of t…

Cited by 4 publications (11 citation statements) · References 11 publications
“…This report presents an optimization analysis of water-quality data collected from selected wells completed in the ESRP aquifer at and near the INL, to identify and remove redundancy in the existing monitoring network and thereby reduce LTM costs while incurring a minimal loss of statistical information. Redundancy is defined by Cameron (2004) as the ability of a reduced dataset to accurately reconstruct features or characteristics that were estimated from the full dataset. The cost savings derived from removing sampling sites (or locations) from the existing network, or from reducing sampling frequency, are realized by not collecting (and analyzing) the additional water samples.…”
Section: Purpose and Scope
confidence: 99%
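To make this definition of redundancy concrete, here is a minimal sketch of a leave-one-out check: each well is removed in turn and its value is reconstructed from the remaining wells, so a well that is easy to reconstruct carries mostly redundant information. The coordinates, concentrations, and the inverse-distance interpolator are illustrative assumptions, not the method of the cited report.

```python
import numpy as np

def idw_estimate(xy, values, target, power=2.0):
    """Inverse-distance-weighted estimate at `target` from the given wells."""
    d = np.linalg.norm(xy - target, axis=1)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return np.sum(w * values) / np.sum(w)

def redundancy_scores(xy, values):
    """Leave-one-out check: reconstruct each well's value from the others.
    A small relative error marks the well as a candidate for removal."""
    scores = []
    for i in range(len(values)):
        mask = np.arange(len(values)) != i
        est = idw_estimate(xy[mask], values[mask], xy[i])
        scores.append(abs(est - values[i]) / max(abs(values[i]), 1e-9))
    return np.array(scores)

# Hypothetical network: (x, y) in meters, concentration in ug/L.
# Wells 1 and 3 are near-duplicates, so both score as highly redundant.
xy = np.array([[0, 0], [50, 10], [100, 0], [55, 15], [200, 80]], dtype=float)
conc = np.array([12.0, 9.5, 7.0, 9.3, 2.1])

print(redundancy_scores(xy, conc))
```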
“…Optimizing LTM networks with the aim of removing data that add little or no beneficial information has received increasing attention in the recent past. An in-depth description of the optimization problem was provided by Cameron (2004). Previous efforts to eliminate redundancy in existing groundwater monitoring networks have examined the temporal and spatial components of redundancy separately.…”
Section: Previous Investigations
confidence: 99%
“…Approaches for temporal (that is, sampling-frequency) optimization of groundwater observation networks have been presented in several studies (Cameron and Hunter, 2002; Aziz and others, 2003; Cameron, 2004; Herrera and Pinder, 2005; Thakur, 2015). These techniques typically try to reduce a network's temporal redundancy by using trend analysis, Kalman filters, or temporal variogram analysis.…”
Section: Background and Previous Investigations
confidence: 99%
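Of the techniques named in that snippet, the temporal variogram is the most self-contained to illustrate. The sketch below computes an empirical semivariogram over time lags for a synthetic quarterly record; the series and the lag bins are assumptions for illustration, not data from any cited study. Once the semivariogram levels off at its sill, samples spaced further apart than that lag add little new information, which is the rationale for lengthening the sampling interval.

```python
import numpy as np

def temporal_semivariogram(times, values, lag_edges):
    """Empirical semivariogram over time lags: for each lag bin,
    gamma(h) = half the mean squared difference over all sample pairs
    whose time separation falls in the bin."""
    t = np.asarray(times, dtype=float)
    z = np.asarray(values, dtype=float)
    lags = np.abs(t[:, None] - t[None, :])
    sq = (z[:, None] - z[None, :]) ** 2
    iu = np.triu_indices(len(t), k=1)          # count each pair once
    lags, sq = lags[iu], sq[iu]
    gamma = []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        sel = (lags >= lo) & (lags < hi)
        gamma.append(0.5 * sq[sel].mean() if sel.any() else np.nan)
    return np.array(gamma)

# Hypothetical quarterly record (time in days) with a slow seasonal signal.
rng = np.random.default_rng(0)
t = np.arange(0, 3650, 90)
z = 10 + 3 * np.sin(2 * np.pi * t / 3650) + rng.normal(0, 0.5, t.size)
edges = np.arange(0, 1000, 90)
print(temporal_semivariogram(t, z, edges))
```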
“…These techniques typically try to reduce a network's temporal redundancy by using trend analysis, Kalman filters, or temporal variogram analysis. Multiple researchers have invoked iterative data-thinning schemes in conjunction with trend analysis to evaluate sampling frequency (Cameron and Hunter, 2002; Cameron, 2004; Thakur, 2015). The iterative thinning approaches often use Sen's (1968) method or local regression for estimating trends and consist of the following components: (1) estimating the trend for the entire time series at a well, (2) iteratively thinning the time series by randomly subsampling it, (3) estimating the trend for each thinned time series, and (4) comparing the trends from the entire and thinned time series.…”
Section: Background and Previous Investigations
confidence: 99%
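The four components listed in that snippet translate directly into a short procedure. Below is a minimal sketch, assuming a synthetic quarterly record; `scipy.stats.theilslopes` implements Sen's (1968) slope estimator, and the retained fractions and trial counts are illustrative choices, not those of any cited study.

```python
import numpy as np
from scipy.stats import theilslopes

def thinning_trial(t, z, keep_frac, n_trials=200, rng=None):
    """Sketch of the four-step iterative thinning scheme:
    (1) Sen's slope on the full record, (2) random subsampling,
    (3) Sen's slope on each thinned record, (4) comparison."""
    if rng is None:
        rng = np.random.default_rng(0)
    full_slope = theilslopes(z, t)[0]                          # step 1
    n_keep = max(3, int(keep_frac * len(t)))
    diffs = []
    for _ in range(n_trials):
        idx = np.sort(rng.choice(len(t), n_keep, replace=False))  # step 2
        thin_slope = theilslopes(z[idx], t[idx])[0]            # step 3
        diffs.append(thin_slope - full_slope)                  # step 4
    return full_slope, np.mean(diffs), np.std(diffs)

# Hypothetical quarterly concentration record with a downward trend.
rng = np.random.default_rng(1)
t = np.arange(0, 3650, 90, dtype=float)
z = 15 - 0.001 * t + rng.normal(0, 0.4, t.size)

for frac in (0.75, 0.5, 0.25):
    full, bias, spread = thinning_trial(t, z, frac)
    print(f"keep {frac:.0%}: full={full:.5f}, bias={bias:+.5f}, sd={spread:.5f}")
```

Comparing the bias and spread of the thinned-record slopes across retained fractions suggests the coarsest sampling frequency whose trend estimate still stays within an acceptable tolerance of the full-record trend.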