2019
DOI: 10.48550/arxiv.1910.13556
Preprint

Convolutional Conditional Neural Processes

Abstract: We introduce the Convolutional Conditional Neural Process (CONVCNP), a new member of the Neural Process family that models translation equivariance in the data. Translation equivariance is an important inductive bias for many learning problems including time series modelling, spatial data, and images. The model embeds data sets into an infinite-dimensional function space as opposed to a finite-dimensional vector space. To formalize this notion, we extend the theory of neural representations of sets to include f…

Cited by 10 publications (14 citation statements)
References 15 publications
“…A later study [32] proposed Attentive Neural Processes (ANPs) by incorporating attention into NPs to alleviate the underfitting problem and improve the regression performance. NP-based models have shown great performance in function regression [33], image reconstruction [32], and point-cloud modeling [34]. As probabilistic latent variable models, ANPs naturally enable continual online learning in continuously parameterized environments.…”
Section: Related Work
confidence: 99%
“…For both variables the model produces qualitatively realistic time series. This model is an example of a more general class of models known as convolutional conditional neural processes (ConvCNPs; Gordon et al, 2019). Formally, PP downscaling is an instance of a supervised learning problem to learn the function f in equation 1.…”
Section: Inclusion Of Sub-grid Scale Topography
confidence: 99%
“…In the context of spatial data such as downscaling predictions, a desirable characteristic of predictions is that they are translation-equivariant; that is, the model makes identical predictions if the data are spatially translated. The convCNP model (Gordon et al, 2019) applied here builds this equivariance into the CNP model. Throughout this study, we refer to the model developed in this section as 'the convCNP model'.…”
Section: Inclusion Of Sub-grid Scale Topography
confidence: 99%
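
The translation-equivariance property described in this excerpt can be checked numerically. Below is a minimal sketch of my own (not code from Gordon et al or the downscaling study), assuming a circular 1-D convolution as a stand-in for the ConvCNP's convolutional architecture; the helper name conv1d_circular and all array sizes are illustrative choices.

import numpy as np

# Circular 1-D convolution: out[i] = sum_j kernel[j] * signal[(i - j) mod n].
# Circularity makes the shift argument exact on a finite grid.
def conv1d_circular(signal, kernel):
    n = len(signal)
    return np.array([
        sum(kernel[j] * signal[(i - j) % n] for j in range(len(kernel)))
        for i in range(n)
    ])

rng = np.random.default_rng(0)
signal = rng.standard_normal(32)  # a function discretized on a regular grid
kernel = rng.standard_normal(5)   # in a ConvCNP this filter would be learned

# Equivariance: convolving a shifted input equals shifting the convolved output.
shift = 7
lhs = conv1d_circular(np.roll(signal, shift), kernel)
rhs = np.roll(conv1d_circular(signal, shift * 0 + kernel), shift)
assert np.allclose(lhs, rhs)

A generic fully connected map on the same grid would fail this check, which is why the excerpt ties the equivariance property to the convolutional architecture.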
“…Symmetry is mathematically described in terms of groups and has become an essential concept in machine learning. Gordon et al (2019) point out that, when data symmetry is represented by an infinite group like the translation group, equivariant maps, which perform symmetry-preserving processing, cannot be captured as maps between finite-dimensional spaces but can be described by maps between infinite-dimensional function spaces. As a related study on symmetry-preserving processing, Finzi et al (2020) propose group convolution of functional representations and investigate practical computational methods such as discretization and localization.…”
Section: Related Work
confidence: 99%
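
To make the excerpt's point about infinite-dimensional function spaces concrete, here is a minimal sketch of my own (an illustration under stated assumptions, not code from the cited works) of a ConvCNP-style functional embedding: the data set {(x_i, y_i)} is mapped to a function via an RBF kernel smoother with a density channel, and translating the data set translates the embedding exactly. The name functional_embedding and the lengthscale value are hypothetical choices.

import numpy as np

def functional_embedding(xs, ys, query, lengthscale=0.5):
    # E(Z)(t) = (sum_i k(t - x_i), sum_i y_i * k(t - x_i)) with an RBF kernel k:
    # a "density" channel plus a value-carrying channel, evaluated at query points.
    diffs = query[:, None] - xs[None, :]           # (T, N) pairwise offsets
    k = np.exp(-0.5 * (diffs / lengthscale) ** 2)  # RBF kernel weights
    density = k.sum(axis=1)
    signal = (k * ys[None, :]).sum(axis=1)
    return np.stack([density, signal], axis=-1)    # (T, 2) functional representation

xs = np.array([-1.0, 0.3, 1.2])    # context inputs
ys = np.array([0.5, -0.2, 1.0])    # context outputs
query = np.linspace(-3.0, 3.0, 121)

emb = functional_embedding(xs, ys, query)

# Translating the data set by delta translates the embedding by delta:
delta = 0.7
emb_shifted = functional_embedding(xs + delta, ys, query + delta)
assert np.allclose(emb, emb_shifted)

Because the embedding lives in a function space rather than a fixed finite-dimensional vector space, a translation of the inputs acts on it exactly, which is the point the excerpt attributes to Gordon et al (2019).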