This work explores the performance of the hydrologic model Hydrotel, applied to 36 catchments located in the Province of Quebec, Canada. A local calibration scheme (each catchment taken individually) and a global calibration scheme (a single parameter set sought for all catchments) are compared from a differential split-sample test perspective. Such a methodology is useful to gain insight into a model’s skill under different climatic conditions, in view of its use for climate change impact studies. The model was calibrated using both schemes on five non-continuous dry and cold years and then evaluated on five dissimilar humid and warm years. Results indicate that, as expected, local calibration leads to better performance than global calibration. However, global calibration achieves satisfactory simulations while offering better temporal robustness (i.e., model transposability to periods with different climatic conditions). In contrast to local calibration, global calibration also imposes spatial consistency on the calibrated parameter values, whereas locally adjusted parameter sets can vary significantly from one catchment to another due to equifinality. A global calibration scheme therefore represents a good trade-off between local performance, temporal robustness, and the spatial consistency of parameter values, which is of interest, for example, for the simulation of ungauged catchments, climate change impact studies, or simply large-scale modeling.
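The differential split-sample logic described above (calibrate on dry/cold years, evaluate on contrasting humid/warm years, for both local and global schemes) can be sketched as follows. This is a minimal illustration, not the study's code: the one-parameter linear-store "model", the synthetic catchments, and the NSE-based grid search are all hypothetical stand-ins for Hydrotel and its calibration procedure.

```python
# Illustrative differential split-sample test: local vs. global calibration.
# All data and the toy model are synthetic, for demonstration only.
import numpy as np

rng = np.random.default_rng(42)

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, < 0 is worse than the mean."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - np.mean(obs)) ** 2)

def model(precip, k):
    """Toy one-parameter runoff model: a linear store with recession rate k."""
    q = np.zeros_like(precip)
    s = 0.0
    for t, p in enumerate(precip):
        s += p
        q[t] = k * s
        s -= q[t]
    return q

# Synthetic "dry/cold" calibration years and "humid/warm" evaluation years
# for three catchments, each with its own true parameter value.
catchments = []
for k_true in (0.2, 0.35, 0.5):
    p_dry = rng.gamma(1.0, 2.0, 365 * 5)
    p_wet = rng.gamma(2.0, 3.0, 365 * 5)
    catchments.append((p_dry, model(p_dry, k_true), p_wet, model(p_wet, k_true)))

grid = np.linspace(0.05, 0.95, 50)

# Local calibration: the best k for each catchment on the dry period.
local_k = [max(grid, key=lambda k: nse(q_dry, model(p_dry, k)))
           for p_dry, q_dry, _, _ in catchments]

# Global calibration: one k maximizing the mean NSE across all catchments.
global_k = max(grid, key=lambda k: np.mean(
    [nse(q_dry, model(p_dry, k)) for p_dry, q_dry, _, _ in catchments]))

# Evaluate both schemes on the climatically contrasting humid period.
local_eval = [nse(q_wet, model(p_wet, k))
              for k, (_, _, p_wet, q_wet) in zip(local_k, catchments)]
global_eval = [nse(q_wet, model(p_wet, global_k))
               for _, _, p_wet, q_wet in catchments]
print("local:", np.round(local_eval, 3), "global:", np.round(global_eval, 3))
```

In this idealized setting the locally calibrated parameters transfer almost perfectly, while the single global parameter pays a per-catchment penalty; the study's point is that with a real model and real data this penalty is modest and is traded against robustness and spatial consistency.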
During spring 2011, an extreme flood occurred along the Richelieu River in southern Quebec, Canada. The Richelieu River is the last section of the complex Richelieu basin, which comprises the large Lake Champlain, located in a valley between two mountain ranges. Previous attempts at reproducing the Richelieu River flow relied on simplified lumped models and showed mixed results. In order to prepare a tool to accurately assess changes in flood recurrence in the future, a state-of-the-art distributed hydrological model was applied over the Richelieu basin. The model setup comprises several novel methods and data sets, such as a very high resolution river network, a modern calibration technique considering the net basin supply of Lake Champlain, a new optimization algorithm, and an up-to-date meteorological data set to force the model. The results show that the hydrological model satisfactorily reproduces the multiyear mean annual hydrograph and the 2011 flow time series when compared with the observed river flow and an estimate of the Lake Champlain net basin supply. Many factors challenged the simulation of the river flow, such as the quality of the meteorological forcing data, which is affected by the low density of the station network and the steep terrain, and the lake storage effect. Overall, the satisfactory validation of the hydrological model allows us to move to the next step, which consists of assessing the impacts of climate change on the recurrence of Richelieu River floods.

Plain Language Summary: In order to study the 2011 Richelieu flood and prepare a tool capable of estimating the effects of climate change on the recurrence of floods, a hydrological model is applied over the Richelieu basin. The application of a distributed hydrological model is useful to simulate the flow of all the tributaries of the Richelieu basin.
This new model setup stands out from past models due to its distribution into several hydrological units, its high-resolution river network, its calibration technique, and the high-resolution weather forcing data set used to drive the model. The model successfully reproduced the 2011 Richelieu River flood and the annual hydrograph. Simulating the Richelieu flow was challenging due to the contrasting elevations of the Richelieu basin and the presence of the large Lake Champlain, which acts as a reservoir and attenuates short-term fluctuations. Overall, the application was deemed satisfactory, and the tool is ready to assess the impacts of climate change on the recurrence of Richelieu River floods.
Abstract. Data assimilation is an essential component of any hydrological forecasting system. Its purpose is to incorporate field observations as they become available in order to correct the state variables of the model prior to the forecasting phase. The goal is to ensure that the forecasts are initialized from state variables that are as representative of reality as possible, and also to estimate the uncertainty of the state variables. There are several data assimilation methods, and particle filters are increasingly popular because of their minimal assumptions. The basic idea is to produce an ensemble of scenarios (i.e., the particles) using perturbations of the forcing variables and/or state variables of the model. The particles are then weighted using the observations as they become available. However, implementing a particle filter over a domain with large spatial dimensions remains challenging, as the number of required particles rises exponentially with domain size. Such a situation is referred to as the “curse of dimensionality”, or a “dimensionality limit”. A common solution to overcome this curse is to localize the particle filter, that is, to divide the large spatial domain into smaller portions, or “blocks”, and to apply the particle filter separately to each block. This can solve the above-mentioned dimensionality problem because it reduces the spatial scale at which each particle filter must be applied. However, it can also cause spatial discontinuities when the blocks are reassembled to form the whole domain. This issue can become even more problematic when additional data are assimilated. The purpose of this study is to test the possibility of remedying the spatial discontinuities of the particles by locally reordering them.
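The localization idea described above (weighting and resampling particles block by block, which defeats the curse of dimensionality at the cost of spatial discontinuities) can be sketched as follows. This is a hedged illustration on synthetic data, not the study's system: the Gaussian observation likelihood, block size, and SWE field are all assumed values.

```python
# Minimal sketch of a block-localized particle filter on a 1-D domain.
# Synthetic SWE field and observations; not the operational configuration.
import numpy as np

rng = np.random.default_rng(0)

n_cells, n_particles, block_size = 12, 200, 4
truth = np.linspace(100.0, 160.0, n_cells)            # "true" SWE field (mm)
particles = truth + rng.normal(0.0, 25.0, (n_particles, n_cells))
obs = truth + rng.normal(0.0, 5.0, n_cells)           # noisy local observations
obs_sigma = 5.0

for start in range(0, n_cells, block_size):
    blk = slice(start, start + block_size)
    # Gaussian likelihood of each particle given only this block's observations.
    err = particles[:, blk] - obs[blk]
    logw = -0.5 * np.sum((err / obs_sigma) ** 2, axis=1)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Multinomial resampling, applied independently per block: this keeps the
    # effective dimension small, but neighbouring blocks may select different
    # particle indices, which is exactly what creates spatial discontinuities.
    idx = rng.choice(n_particles, size=n_particles, p=w)
    particles[:, blk] = particles[idx, blk]

analysis = particles.mean(axis=0)
print(np.round(np.abs(analysis - truth).mean(), 2))   # mean absolute error (mm)
```

Because each block resamples on its own, a given reassembled particle can jump between unrelated ensemble members at block boundaries; the reordering strategies tested in the study address precisely this artifact.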
We implement a spatialized particle filter to estimate the snow water equivalent (SWE) over a large territory in eastern Canada by assimilating local SWE observations from manual snow surveys. We apply two reordering strategies, based on (1) a simple ascending-order sort and (2) the Schaake shuffle, and evaluate their ability to maintain the spatial structure of the particles. To increase the amount of assimilated data, we investigate the inclusion of a second data set (SR50), in which the SWE is indirectly estimated from automatic measurements of snow depth using sonic sensors. The two reordering solutions maintain the spatial structure of the individual particles throughout the winter season, which significantly reduces the spatial random noise in the distribution of the particles and decreases the uncertainty associated with the estimation. The Schaake shuffle proves to be a better tool for maintaining a realistic spatial structure for all particles, although we also found that sorting provides a simpler and satisfactory solution. The assimilation of the secondary data set improved SWE estimates at ungauged sites when compared with the deterministic model, but we noted no significant improvement when both the snow courses and the SR50 data were assimilated.
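The two reordering strategies can be sketched as follows. This is an illustration on a small synthetic ensemble, not the study's implementation: ascending-order sorting makes particle i the i-th smallest value at every cell, while the Schaake shuffle imposes, at each cell, the rank pattern of a reference ensemble (here a smooth synthetic one; the choice of reference is an assumption of this sketch).

```python
# Hedged illustration of the two particle-reordering strategies:
# per-cell ascending sort, and the Schaake shuffle against a reference.
import numpy as np

rng = np.random.default_rng(1)
n_particles, n_cells = 6, 5

# Particles whose spatial continuity was broken by per-block resampling.
particles = rng.normal(120.0, 20.0, (n_particles, n_cells))

# Strategy 1: ascending-order sort, independently at each cell.
# Reordered particle i then holds the i-th smallest value everywhere,
# so each particle becomes a spatially smooth quantile field.
sorted_particles = np.sort(particles, axis=0)

# Strategy 2: Schaake shuffle. A reference ensemble with realistic spatial
# structure supplies the target rank of each member at each cell.
reference = (np.linspace(100.0, 140.0, n_particles)[:, None]
             + rng.normal(0.0, 2.0, (n_particles, n_cells)))

shuffled = np.empty_like(particles)
for j in range(n_cells):
    ranks = np.argsort(np.argsort(reference[:, j]))   # rank of each ref member
    shuffled[:, j] = np.sort(particles[:, j])[ranks]  # match that rank pattern

# Both strategies preserve the per-cell marginal distribution by construction:
# each cell keeps exactly the same set of values, only reassigned to members.
print(np.allclose(np.sort(shuffled, axis=0), np.sort(particles, axis=0)))  # True
```

The key property, reflected in the final check, is that reordering never alters the ensemble's per-cell values or uncertainty; it only reassigns values among members so that each individual particle regains a coherent spatial structure.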