2003
DOI: 10.5194/npg-10-373-2003

Linear and nonlinear post-processing of numerically forecasted surface temperature

Abstract: In this paper we test different approaches to the statistical post-processing of gridded numerical surface air temperatures (provided by the European Centre for Medium-Range Weather Forecasts) onto the temperature measured at surface weather stations located in the Italian region of Puglia. We consider simple post-processing techniques, like correction for altitude, linear regression from different input parameters and Kalman filtering, as well as a neural network training procedure, stabilised (i.e. …
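The abstract lists Kalman filtering among the simple post-processing techniques. As a rough illustration of how a scalar Kalman filter can track a slowly drifting forecast bias and remove it from the raw model temperature (a minimal sketch with assumed variable names and noise variances, not the formulation used in the paper), one could write:

```python
import numpy as np

def kalman_bias_correction(forecasts, observations, q=0.01, r=1.0):
    """Scalar Kalman filter tracking a slowly varying forecast bias.

    forecasts, observations : 1-D arrays of model and station temperatures (degC)
    q : process-noise variance (how fast the bias may drift)
    r : observation-noise variance (day-to-day scatter of the forecast error)
    Returns the bias-corrected forecasts and the final bias estimate.
    """
    bias, p = 0.0, 1.0                    # initial bias estimate and its variance
    corrected = np.empty(len(forecasts), dtype=float)
    for t, (f, o) in enumerate(zip(forecasts, observations)):
        p += q                            # prediction step: random-walk bias model
        err = f - o                       # today's observed forecast error
        k = p / (p + r)                   # Kalman gain
        bias += k * (err - bias)          # update the bias estimate
        p *= (1.0 - k)
        corrected[t] = f - bias           # remove the current bias estimate
    return corrected, bias
```

In use, the filter would be run over the recent history of forecast/observation pairs for one station, and the latest bias estimate subtracted from the next raw forecast.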

Cited by 21 publications (20 citation statements). References 15 publications.
“…Perhaps the most severe drawback of MLPs is the possibility that the learning procedure will end in a local minimum of the error function instead of reaching the global one. The seriousness of this problem can be reduced by means of repeated training or by application of a global minimum search technique (such as the one used by Casaioli et al., 2003), but usually at the cost of increased training time. There is also the possibility of overtraining, when the learning procedure runs for too long a time, and the network becomes overoptimized for the description of the cases from the learning set, thus losing its ability to generalize.…”
Section: Discussion (mentioning)
confidence: 99%
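The strategy described in this statement, repeated training from different initialisations to reduce the risk of a poor local minimum, together with a stopping criterion against overtraining, can be sketched as follows. This is a generic restart recipe using scikit-learn's MLPRegressor, with assumed layer sizes and restart counts; it is not the global minimum search technique of Casaioli et al. (2003):

```python
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

def train_best_mlp(X_train, y_train, X_val, y_val, n_restarts=10):
    """Repeat MLP training from different random initialisations and keep
    the network with the lowest validation error, so that a single run
    trapped in a poor local minimum does not decide the final model."""
    best_model, best_mse = None, float("inf")
    for seed in range(n_restarts):
        mlp = MLPRegressor(hidden_layer_sizes=(10,),
                           max_iter=2000,
                           early_stopping=True,      # guards against overtraining
                           validation_fraction=0.2,
                           random_state=seed)        # different initial weights per restart
        mlp.fit(X_train, y_train)
        mse = mean_squared_error(y_val, mlp.predict(X_val))
        if mse < best_mse:
            best_model, best_mse = mlp, mse
    return best_model, best_mse
```

As the quoted statement notes, the price of the restarts is a roughly n_restarts-fold increase in training time.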
“…Several studies have been published devoted to downscaling or postprocessing of temperatures by nonlinear methods, mostly MLPs (Trigo and Palutikof, 1999; Schoof and Pryor, 2001; Marzban, 2003; Casaioli et al., 2003) or neural networks based on RBF functions (Weichert and Bürger, 1998). Here, downscaling of the gridded large-scale data was done for the predictand series of daily mean, minimum and maximum temperatures.…”
Section: Downscaling (mentioning)
confidence: 99%
“…To solve these difficulties different techniques have been applied by different authors, relating the predictions in grid points to real physical sites (Weichert and Bürger, 1998; Schoof and Pryor, 2001; Huth, 2002, 2004). Several methods were widely used to do the downscaling such as Perfect Prog, Model Output Statistics (MOS) (Wilks, 1995), Artificial Neural Networks (Hsieh et al., 1998) or even Multilinear Regression (Woodcock and Southern, 1983; Casaioli et al., 2003).…”
Section: Introduction (mentioning)
confidence: 99%
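A minimal sketch of the multilinear-regression (MOS-style) step mentioned in this statement, with hypothetical function and variable names, assuming an ordinary least-squares fit of station temperature on a few model output predictors:

```python
import numpy as np

def fit_mos_regression(predictors, station_temp):
    """Ordinary least-squares fit of station temperature on model predictors
    (e.g. grid-point temperature, geopotential, humidity): the classical
    MOS / multilinear-regression relation between grid points and a site."""
    X = np.column_stack([np.ones(len(predictors)), predictors])  # add intercept column
    coeffs, *_ = np.linalg.lstsq(X, station_temp, rcond=None)
    return coeffs

def apply_mos_regression(coeffs, predictors):
    """Apply the fitted regression to new model output."""
    X = np.column_stack([np.ones(len(predictors)), predictors])
    return X @ coeffs
```

The coefficients would be fitted on a training period of paired model output and station observations, then applied to subsequent forecasts.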
“…The numerical output must usually undergo some post-processing treatment in order to fulfil the specific tasks of interest. Among the applications devised in the context of the project were, for example, the coupling of numerical model precipitation with river flow models for purposes of flood forecasting (Calenda et al., 2000) and the post-processing of surface temperature forecasts for agricultural applications (Casaioli et al., 2003). A first example of the capabilities of the 'sea section' of the system was presented by Bargagli et al. (2002), who performed tidal propagation and sea-state hindcasting experiments.…”
Section: Objective Verification of the System and Applications (mentioning)
confidence: 99%