This study determines which aspects of river bathymetry have the greatest influence on predictive biases when simulating hyporheic exchange. To investigate this, we build a highly parameterized HydroGeoSphere model of the Steinlach River Test Site in southwest Germany as a reference. This model is then rerun with progressively simplified bathymetries, and the resulting changes to hyporheic exchange fluxes and transit time distributions are evaluated. Results indicate that simulating hyporheic exchange with a high-resolution, detailed bathymetry in a three-dimensional, fully coupled model produces nested multiscale hyporheic exchange systems. A poorly resolved bathymetry underestimates small-scale hyporheic exchange, biasing the simulation towards larger exchange scales and thus overestimating hyporheic exchange residence times. When extrapolated to account for all meanders along an entire river within a watershed, this can grossly bias estimates of a catchment's capacity to attenuate pollutants. The detailed river slope alone is not enough to accurately simulate the locations and magnitudes of losing and gaining river reaches; local bedforms, i.e., bathymetric highs and lows within the river, are also required. Bathymetry surveying campaigns can therefore be made more effective by prioritizing measurements along the thalweg and gegenweg of a meandering channel. We define the gegenweg as the line that connects the shallowest points of successive cross-sections along a river, opposite the thalweg, under average flow conditions. Incorporating local bedforms in this way will likely capture the nested nature of hyporheic exchange, leading to more physically meaningful simulations of hyporheic exchange fluxes and transit times.
Hsu-Chun Hsiao, Daniele E. Asoni, Simon Scherrer, Adrian Perrig, and Yih-Chun Hu (arXiv:1807.05652v1 [cs.NI], 16 Jul 2018)

Abstract. The detection of network flows that send excessive amounts of traffic is of increasing importance for enforcing QoS and countering DDoS attacks. Large-flow detection has been explored before, but the proposed approaches can be used on high-capacity core routers only at the cost of significantly reduced accuracy, because their memory and processing overhead is otherwise too high. We propose CLEF, a new large-flow detection scheme with low memory requirements that maintains high accuracy under the strict conditions of high-capacity core routers. We compare our scheme with previous proposals through extensive theoretical analysis and through an evaluation based on worst-case-scenario attack traffic, and we show that CLEF outperforms previously proposed systems in settings with limited memory.

Keywords: large-flow detection, damage metric, memory and computation efficiency

Footnote: As in prior literature [15,42], the term "large flow" denotes a flow that sends more than its allocated bandwidth.
Many networking and security applications can benefit from exact detection of large flows over arbitrary windows (i.e., any possible time window). Existing large-flow detectors that only check average throughput over a fixed time period cannot detect bursty flows and are therefore easily fooled by attackers. However, no scalable approach provides exact classification in one pass. To address this challenge, we consider a new model of exactness outside an ambiguity region, defined as the range of bandwidths below a high-bandwidth threshold and above a low-bandwidth threshold. Given this model, we propose a deterministic algorithm, EARDet, that detects all large flows (including bursty flows) and avoids falsely accusing any small flow, regardless of the input traffic distribution. EARDet monitors flows over arbitrary time windows and builds on a frequent-items algorithm based on average frequency. Despite these strong properties, EARDet has low storage overhead regardless of input traffic and is surprisingly scalable because it focuses only on accurately classifying large and small flows. Our evaluations confirm that existing approaches suffer from high error rates (e.g., misclassifying 1% of small flows as large flows) in the presence of large and bursty flows, whereas EARDet accurately detects both at gigabit line rate using an amount of memory small enough to fit into on-chip SRAM.
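As background on the frequent-items idea the abstract refers to, the classic Misra-Gries summary below illustrates the core mechanism: with k counters, any item occurring in more than 1/(k+1) of the stream is guaranteed to retain a counter, and no counter ever exceeds an item's true count. This is a simplified, per-item sketch for intuition (function and variable names are mine), not EARDet itself, which extends such counters to byte-weighted traffic and arbitrary time windows.

```python
def misra_gries(stream, k):
    """Misra-Gries frequent-items summary with at most k counters.

    Guarantee: any item whose frequency exceeds len(stream)/(k+1)
    survives in the returned dict; counters never overestimate the
    true count, so rare items are never falsely reported as frequent.
    """
    counters = {}
    for item in stream:
        if item in counters:
            counters[item] += 1
        elif len(counters) < k:
            counters[item] = 1
        else:
            # No free counter: decrement every counter by one and
            # evict those that reach zero. This "charges" the new item
            # against k existing ones, preserving the guarantee above.
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return counters


# Hypothetical flow IDs: flow 'A' sends 6 of 10 packets, so with k=1
# it must survive; the interleaved small flows B..E are all evicted.
packets = ['A', 'B', 'A', 'C', 'A', 'D', 'A', 'E', 'A', 'A']
print(misra_gries(packets, k=1))  # {'A': 2}
```

In a large-flow-detection setting, each stream item would be a packet's flow identifier, and the surviving counters name the candidate large flows; the one-sided guarantee (no overcounting) mirrors EARDet's no-false-accusation property for small flows.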