2019 · Preprint
DOI: 10.48550/arxiv.1906.08324
The Functional Neural Process

Christos Louizos, Xiahan Shi, Klamer Schutte et al.

Abstract: We present a new family of exchangeable stochastic processes, the Functional Neural Processes (FNPs). FNPs model distributions over functions by learning a graph of dependencies on top of latent representations of the points in the given dataset. In doing so, they define a Bayesian model without explicitly positing a prior distribution over latent global parameters; they instead adopt priors over the relational structure of the given dataset, a task that is much simpler. We show how we can learn such models fr…
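To make the mechanism in the abstract concrete, here is a minimal, hypothetical sketch of the FNP idea in Python: points are embedded into latent representations, a bipartite dependency graph over a reference set is sampled from latent similarities, and predictions for new points propagate through that graph. The toy encoder, the similarity-based edge model, and all names below are illustrative simplifications under assumed details, not the authors' implementation.

```python
# Hedged sketch of the FNP idea: latent embeddings + a sampled dependency
# graph over a reference set, through which predictions are formed.
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W):
    """Toy deterministic encoder mapping inputs to latent representations."""
    return np.tanh(x @ W)

def edge_probs(z_new, z_ref, temperature=1.0):
    """Bernoulli edge probabilities from latent similarity: closer latents
    are more likely to be connected in the dependency graph."""
    d2 = ((z_new[:, None, :] - z_ref[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / temperature)

# Toy 1D regression dataset.
x_ref = np.linspace(-2, 2, 10)[:, None]              # reference set
y_ref = np.sin(2 * x_ref) + 0.1 * rng.standard_normal(x_ref.shape)
x_new = np.array([[0.3], [-1.2]])                    # points to predict

W = rng.standard_normal((1, 8))                      # toy encoder weights
z_ref, z_new = encode(x_ref, W), encode(x_new, W)

# Sample a bipartite dependency graph A: which reference points each new
# point depends on. This stands in for the learned relational prior.
P = edge_probs(z_new, z_ref)
A = rng.random(P.shape) < P

# Predict each new point from the targets of its sampled parents; a real FNP
# parameterizes this step with a neural network and a proper likelihood.
# (Rows with no sampled parents fall back to a zero prediction here.)
weights = A / np.maximum(A.sum(1, keepdims=True), 1)
y_pred = weights @ y_ref
print(y_pred)
```

The point of the sketch is only the factorization: the prior lives on the sampled graph A over the dataset's latents rather than on global weights.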


Cited by 5 publications (5 citation statements); references 18 publications.
Citation types: 0 supporting, 5 mentioning, 0 contrasting.
“…In this section, we demonstrate the performance of the proposed priors on a 1D test function, as used in [Louizos et al., 2019]. We compare to a BNN with independent Gaussian priors and a mean-field Gaussian variational approximation, and to MetaNN [Karaletsos et al., 2018].…”
Section: B2 Results on a Synthetic Regression Example (mentioning, confidence: 99%)
“…On the inference side, there are attentive NPs, which endow the encoder with self-attention (and thus make it Turing complete; Pérez et al., 2021), and convolutional (conditional) NPs (Gordon et al., 2019; Foong et al., 2020), which add translation equivariance to the model. On the generative side, there are functional NPs (Louizos et al., 2019), which introduce dependence between the predictions by learning a relational graph structure over the latents z, and Gaussian NPs (Bruinsma et al., 2021), which achieve a similar property by replacing the generative likelihood with a GP whose mean and kernel are inferred from the latents.…”
Section: Neural Processes (mentioning, confidence: 99%)
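As a rough illustration of the Gaussian NP idea quoted above, the sketch below (assumed details, not Bruinsma et al.'s implementation) builds a GP predictive whose kernel is evaluated in a learned-style embedding space, so the predictive mean and covariance are "inferred from the latents"; the fixed tanh embedding stands in for a trained encoder.

```python
# Hedged sketch: GP predictive with mean/kernel computed from latent
# embeddings of the context set, in the spirit of Gaussian NPs.
import numpy as np

rng = np.random.default_rng(1)

def embed(x, W):
    """Hypothetical learned embedding; fixed random weights here."""
    return np.tanh(x @ W)

x_ctx = np.linspace(-3, 3, 15)[:, None]    # observed context points
y_ctx = np.sin(x_ctx).ravel()
x_tgt = np.linspace(-3, 3, 50)[:, None]    # target locations

W = rng.standard_normal((1, 6))
h_ctx, h_tgt = embed(x_ctx, W), embed(x_tgt, W)

def kernel(a, b):
    """RBF kernel in embedding space, so the GP depends on the latents."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2)

# Standard GP posterior predictive over the targets given the context;
# predictions are jointly dependent through the full covariance matrix.
K_cc = kernel(h_ctx, h_ctx) + 1e-4 * np.eye(len(x_ctx))  # jitter for stability
K_tc = kernel(h_tgt, h_ctx)
mean = K_tc @ np.linalg.solve(K_cc, y_ctx)
cov = kernel(h_tgt, h_tgt) - K_tc @ np.linalg.solve(K_cc, K_tc.T)
print(mean[:5], np.diag(cov)[:5])
```

The contrast with a standard NP is in the last two lines: the likelihood is a joint Gaussian over all targets rather than a factorized per-point distribution.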
“…NGPs (Neural Gaussian Processes) [11, 12, 18] are a novel class of probabilistic neural models for accurate and calibrated predictions. NGP models such as EPIFNP [12] and its variants [11] have shown state-of-the-art performance in calibrated probabilistic forecasting across multiple benchmarks for single time series.…”
Section: Raw Forecast Distributions from NGPs (mentioning, confidence: 99%)