2021
DOI: 10.1016/j.neunet.2021.08.020

Extremely randomized neural networks for constructing prediction intervals

Cited by 9 publications (12 citation statements)
References 46 publications

“…The concept of MC dropout is central to this novel literature on prediction intervals for NN models, see Srivastava et al (2014). An alternative to construct prediction intervals in a NN framework, recently proposed in Mancini et al (2021), explores the results for randomised trees and derives valid confidence intervals of the model predictions in finite samples. In this work, we focus on the latter two methodologies that are reviewed in the following subsections.…”
Section: Prediction Intervals For Multi-layer Nnsmentioning
confidence: 99%
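
As context for the MC dropout approach mentioned in the statement above, here is a minimal sketch of dropout-based prediction intervals for a regression network, assuming PyTorch. The network width, dropout rate, number of stochastic passes, and coverage level are illustrative choices, not values taken from the cited papers.

```python
# Minimal sketch of MC-dropout prediction intervals for a regression NN.
# Assumes PyTorch; architecture and hyperparameters are illustrative only.
import torch
import torch.nn as nn


class DropoutRegressor(nn.Module):
    def __init__(self, n_inputs: int, n_hidden: int = 64, p_drop: float = 0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_inputs, n_hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(n_hidden, n_hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(n_hidden, 1),
        )

    def forward(self, x):
        return self.net(x)


@torch.no_grad()
def mc_dropout_interval(model, x, n_samples: int = 200, alpha: float = 0.05):
    """Keep dropout active at prediction time and repeat the forward pass;
    the empirical quantiles of the draws give an approximate
    (1 - alpha) prediction interval."""
    model.train()  # leaves the dropout layers stochastic during prediction
    draws = torch.stack([model(x) for _ in range(n_samples)], dim=0)
    lower = torch.quantile(draws, alpha / 2, dim=0)
    upper = torch.quantile(draws, 1 - alpha / 2, dim=0)
    return lower, upper
```
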
“…The EN network approach can be interpreted as an ensemble predictor for NN models. This methodology, formally introduced in Mancini et al (2021), is based on the extremely randomised trees approach proposed by Geurts et al (2006) for random forests.…”
Section: Prediction Intervals For Multi‐layer Nnsmentioning
confidence: 99%
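
The EN network approach described in the statement above builds prediction intervals from an ensemble of randomised networks. The sketch below illustrates only the generic ensemble-quantile idea, using scikit-learn's MLPRegressor with bootstrap resampling and fresh random initialisations; it is not the extremely randomised construction of Mancini et al. (2021), and the resulting spread reflects model variability only. Inputs are assumed to be NumPy arrays.

```python
# Generic sketch: prediction intervals from an ensemble of randomised NNs.
# Each member differs by a bootstrap resample and a random initialisation;
# interval endpoints are empirical quantiles of the member predictions.
import numpy as np
from sklearn.neural_network import MLPRegressor


def ensemble_interval(X_train, y_train, X_test,
                      n_members: int = 50, alpha: float = 0.05, seed: int = 0):
    rng = np.random.default_rng(seed)
    n = len(X_train)
    preds = []
    for m in range(n_members):
        idx = rng.integers(0, n, size=n)            # bootstrap resample
        net = MLPRegressor(hidden_layer_sizes=(32,),
                           max_iter=1000,
                           random_state=m)          # fresh random initialisation
        net.fit(X_train[idx], y_train[idx])
        preds.append(net.predict(X_test))
    preds = np.stack(preds, axis=0)                 # shape (n_members, n_test)
    lower = np.quantile(preds, alpha / 2, axis=0)
    upper = np.quantile(preds, 1 - alpha / 2, axis=0)
    return preds.mean(axis=0), lower, upper
```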