2022
DOI: 10.1002/env.2780

REDS: Random ensemble deep spatial prediction

Abstract: There has been a great deal of recent interest in the development of spatial prediction algorithms for very large datasets and/or prediction domains. These methods have primarily been developed in the spatial statistics community, but there has been growing interest in the machine learning community for such methods, primarily driven by the success of deep Gaussian process regression approaches and deep convolutional neural networks. These methods are often computationally expensive to train and implement and …

Cited by 6 publications (6 citation statements)
References 66 publications

“…One possible direction is to utilize the connection between wide, deep networks and Gaussian processes, as proposed by Lee et al (2017), and apply Bayesian inference to construct a credible interval for prediction. Alternatively, Daw and Wikle (2022) suggested using random feature expansions in a hierarchical manner for uncertainty quantification. Another option is to train an NN using “Bayes by Backprop” as suggested by Blundell et al (2015) within the context of variational inference.…”
Section: Conclusion and Discussion (mentioning)
confidence: 99%
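The random feature expansion route mentioned in this statement can be illustrated with a short sketch. The snippet below is a minimal, hypothetical example of a random Fourier feature expansion for spatial regression, assuming a fixed lengthscale and a ridge fit on the expanded features; it is not the implementation of Daw and Wikle (2022), and every function name and parameter value here is an illustrative assumption.

```python
import numpy as np

def random_fourier_features(coords, n_features=200, lengthscale=1.0, seed=0):
    """Map 2-D spatial coordinates to a fixed random Fourier feature basis.
    Illustrative sketch: frequencies are drawn once and then held fixed."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=1.0 / lengthscale, size=(coords.shape[1], n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(coords @ W + b)

def fit_ridge(Phi, y, lam=1e-2):
    """Closed-form ridge regression on the expanded features."""
    A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

# Illustrative usage on synthetic data: fit a smooth spatial surface.
coords = np.random.rand(500, 2)                  # locations in [0, 1]^2
y = np.sin(6 * coords[:, 0]) + np.cos(4 * coords[:, 1]) + 0.1 * np.random.randn(500)
Phi = random_fourier_features(coords)
beta = fit_ridge(Phi, y)
y_hat = Phi @ beta                               # in-sample spatial predictions
```

Because the random frequencies are fixed, only the linear output weights are estimated, which keeps each fit cheap; hierarchical or ensemble use of such expansions, as in the cited work, builds on this basic ingredient.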
“…Temporal, spatial and spatio‐temporal models are central to the vast majority of contributions to the issue. Lowther et al (2023) consider multiple time series data that contain change points, and showcase their methods on data on the Greenland ice sheet; Kleiber et al (2023) consider the problem of modeling and simulating tropical cyclone precipitation fields using a spatio‐temporal model in polar coordinates; Shirota et al (2023) tackle the problem of fitting spatial models to light detection and ranging (LiDAR) data collected over Alaska; Abdulah et al (2023) consider the spatial analysis of sea‐surface temperature data; Jurek and Katzfuss (2023) the spatio‐temporal analysis of total precipitable water; Daw and Wikle (2023) the spatial analysis of satellite temperature data; Ning et al (2023) the spatial analysis of presence‐absence ecological data; and the discussion by Rougier et al (2023) focuses on the challenges of fitting spatio‐temporal models to environmental data. The large number of contributed papers involving these classes of models is not coincidental, as many of the phenomena that are analyzed in EDS are temporal, spatial or spatio‐temporal in nature.…”
Section: Statistical Temporal, Spatial and Spatio‐temporal Modeling (mentioning)
confidence: 99%
“…The special issue reflects the increased adoption of techniques commonly associated with the field of machine learning or artificial intelligence (AI) by statisticians working in EDS. Kleiber et al (2023) use random forests to model basis‐function coefficients of a spatio‐temporal process, while Daw and Wikle (2023) develop an approach based on an ensemble of deep learning models, known as extreme learning machines, for spatial prediction and uncertainty quantification of the predictions. A common criticism of deep learning models is that they are not interpretable; the rise of explainable AI and how this is applied in the context of EDS is discussed at length by the working group “AI methods in Environmental Science” of The International Environmetrics Society (TIES) in Wikle et al (2023).…”
Section: Machine Learning and Artificial Intelligence (mentioning)
confidence: 99%
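As a rough illustration of the ensemble-of-extreme-learning-machines idea described in this statement, the sketch below fits several single-hidden-layer networks whose hidden weights are drawn at random and never trained, then summarizes the member predictions by their median and inter-quartile range. The activation, ensemble size, and ridge penalty are assumptions made for illustration and do not reproduce the REDS implementation.

```python
import numpy as np

def elm_fit_predict(X_train, y_train, X_test, n_hidden=100, lam=1e-2, rng=None):
    """One extreme learning machine: hidden weights are random and frozen;
    only the output layer is estimated, here by ridge regression."""
    rng = rng if rng is not None else np.random.default_rng()
    W = rng.normal(size=(X_train.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H_train = np.tanh(X_train @ W + b)
    H_test = np.tanh(X_test @ W + b)
    beta = np.linalg.solve(H_train.T @ H_train + lam * np.eye(n_hidden),
                           H_train.T @ y_train)
    return H_test @ beta

def ensemble_predict(X_train, y_train, X_test, n_members=50, seed=0):
    """Random ensemble: refit the ELM with independently drawn hidden weights
    and summarize the members by their median and inter-quartile range."""
    rng = np.random.default_rng(seed)
    preds = np.stack([elm_fit_predict(X_train, y_train, X_test, rng=rng)
                      for _ in range(n_members)])
    median = np.median(preds, axis=0)
    iqr = np.quantile(preds, 0.75, axis=0) - np.quantile(preds, 0.25, axis=0)
    return median, iqr
```

Each member sees the same data; the spread across members comes entirely from the independently drawn hidden weights, which is what makes refitting many members inexpensive.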
“…As indicated in Section 2, uncertainty quantification and the selection of hyperparameters in this model must be accounted for. We quantify forecast uncertainty through ensembles, but our method is motivated by the highest density region proposed by Hyndman (1996) and the calibration strategy of Daw and Wikle (2022) as described in Section 3.2.…”
Section: Nested ESN Model (mentioning)
confidence: 99%
“…The motivation for (12) is from the calibration strategy of Daw and Wikle (2022). To quantify the uncertainty for spatial prediction in reservoir models, Daw and Wikle (2022) first obtained the median and inter-quartile range from ensembles and trained the optimal calibration cutoff value to ensure a (1 − α)% coverage rate over the training set.…”
Section: Uncertainty Quantification (mentioning)
confidence: 99%
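A minimal sketch of the calibration strategy summarized in this statement might look as follows, assuming (for illustration only) that prediction intervals take the form median ± k · IQR and that the cutoff k is tuned on training predictions until the nominal coverage is reached; the interval form and the search grid are assumptions rather than the exact procedure of Daw and Wikle (2022).

```python
import numpy as np

def calibrate_cutoff(y_true, median, iqr, alpha=0.05, n_grid=200):
    """Find the smallest multiplier k such that intervals
    [median - k * iqr, median + k * iqr] cover at least (1 - alpha) of the
    training observations. The interval form is an illustrative assumption."""
    target = 1.0 - alpha
    for k in np.linspace(0.1, 10.0, n_grid):
        covered = np.mean((y_true >= median - k * iqr) &
                          (y_true <= median + k * iqr))
        if covered >= target:
            return k
    return 10.0  # fall back to the widest multiplier considered

# Hypothetical usage with ensemble summaries computed on the training set:
# k = calibrate_cutoff(y_train, train_median, train_iqr, alpha=0.05)
# lower, upper = test_median - k * test_iqr, test_median + k * test_iqr
```

The calibrated cutoff is then applied to the test-set ensemble summaries, so that the resulting intervals inherit approximately (1 − α) coverage from the training data.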