2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr42600.2020.01456

HCNAF: Hyper-Conditioned Neural Autoregressive Flow and its Application for Probabilistic Occupancy Map Forecasting

Abstract: We introduce Hyper-Conditioned Neural Autoregressive Flow (HCNAF), a powerful universal distribution approximator designed to model arbitrarily complex conditional probability density functions. HCNAF consists of a neural-net based conditional autoregressive flow (AF) and a hyper-network that takes large conditions in a non-autoregressive fashion and outputs the network parameters of the AF. Like other flow models, HCNAF performs exact likelihood inference. We demonstrate the effectiveness and attributes of HC…
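As a rough, non-authoritative illustration of the mechanism the abstract describes, the Python sketch below uses a hyper-network that consumes a condition vector non-autoregressively and emits the weights of a tiny per-dimension transformer. The class name, layer sizes, and the context_dim/hidden_dim arguments are assumptions made here for illustration, not the authors' architecture, and the invertibility constraints a real flow requires (e.g., positivity of certain weights) are omitted.

import torch
import torch.nn as nn

class HyperConditionedStep(nn.Module):
    """Toy sketch of hyper-conditioning: a hyper-network maps the condition
    to the parameters of a tiny 1 -> hidden -> 1 MLP that transforms one
    autoregressive dimension.  Illustrative only, not the HCNAF architecture;
    invertibility constraints (e.g., positive weights) are omitted."""

    def __init__(self, context_dim: int, hidden_dim: int = 16):
        super().__init__()
        self.hidden_dim = hidden_dim
        n_params = 3 * hidden_dim + 1  # W1, b1 (hidden each), w2 (hidden), b2 (1)
        self.hyper = nn.Sequential(
            nn.Linear(context_dim, 64), nn.ReLU(), nn.Linear(64, n_params)
        )

    def forward(self, x_i: torch.Tensor, context: torch.Tensor) -> torch.Tensor:
        # x_i: (batch, 1) one flow dimension; context: (batch, context_dim)
        p = self.hyper(context)                  # per-sample parameter vector
        h = self.hidden_dim
        W1, b1 = p[:, :h], p[:, h:2 * h]
        w2, b2 = p[:, 2 * h:3 * h], p[:, 3 * h:]
        hidden = torch.tanh(x_i * W1 + b1)       # (batch, hidden_dim)
        return (hidden * w2).sum(dim=1, keepdim=True) + b2

# Usage sketch: transform one dimension of x under a per-sample condition.
step = HyperConditionedStep(context_dim=8)
z = step(torch.randn(4, 1), torch.randn(4, 8))
print(z.shape)  # torch.Size([4, 1])

Because the transformer's weights come from the condition rather than from concatenating the condition to its input, even large conditioning inputs can influence every parameter of the flow step.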

Cited by 9 publications (19 citation statements). References 22 publications.

Citation statements:
“…Furthermore, a group of generative models that are used to model a conditional distribution p(x_o | x_i) are called conditional generative models. Examples include conditional VAE [18], conditional GAN [19], and conditional flow [5].…”
Section: Related Work (mentioning)
Confidence: 99%
“…Normalizing flow, or Flow, is also a type of explicit generative model. While flow-based models [4,5,22] have access to the exact probability density p(x), they are typically better suited for density estimation than for generative tasks (e.g., trajectory forecasting).…”
Section: Related Work (mentioning)
Confidence: 99%
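As background for the exact-density remark above, flow models obtain the likelihood through the change-of-variables formula; the statement below is the standard one for normalizing flows in general, not a formula taken from the cited works:

\log p_X(x) = \log p_Z\big(f(x)\big) + \log \left| \det \frac{\partial f(x)}{\partial x} \right|

Here f is the invertible flow mapping data x to a base variable z = f(x) with a simple density p_Z; autoregressive flows make the Jacobian triangular, so the log-determinant reduces to a sum over its diagonal entries.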
“…Conventionally, conditioning information is injected into networks via addition [20], concatenation [21], or adaptive instance normalization [22]. Conversely, HC learns transformations of conditioning signals which ultimately control the underlying model weights (for instance, the coefficients of a convolutional kernel) [12]. This type of adaptation is more direct and expressive, validating its use in recent audio style transfer applications [13].…”
Section: Hyperconditioned Blocks (mentioning)
Confidence: 99%
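To make the contrast with concatenation concrete, the sketch below generates the coefficients of a 1-D convolution kernel from the conditioning vector instead of appending that vector to the layer input. The class name HyperConv1d, the single-linear hyper-network, and all dimensions are assumptions for illustration, not the blocks from the cited work.

import torch
import torch.nn as nn
import torch.nn.functional as F

class HyperConv1d(nn.Module):
    """Hyper-conditioned 1-D convolution: a linear hyper-network maps the
    conditioning vector to the convolution kernel itself (sketch only)."""

    def __init__(self, cond_dim: int, channels: int, kernel_size: int = 5):
        super().__init__()
        self.channels, self.kernel_size = channels, kernel_size
        self.to_kernel = nn.Linear(cond_dim, channels * channels * kernel_size)

    def forward(self, x: torch.Tensor, cond: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time); cond: (batch, cond_dim)
        outs = []
        for xb, cb in zip(x, cond):  # each sample gets its own kernel
            w = self.to_kernel(cb).view(self.channels, self.channels, self.kernel_size)
            outs.append(F.conv1d(xb.unsqueeze(0), w, padding=self.kernel_size // 2))
        return torch.cat(outs, dim=0)

layer = HyperConv1d(cond_dim=4, channels=2)
y = layer(torch.randn(3, 2, 100), torch.randn(3, 4))
print(y.shape)  # torch.Size([3, 2, 100])

The Python loop exists because every sample in the batch receives its own kernel; a grouped convolution is a common way to avoid it in practice.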
“…We continue the trend towards lightweight, interpretable audio deep learning models, using several IIR filters as building blocks. Unlike previous attempts to learn trainable IIR filters using deep learning [9,11], we use hyperconditioning (HC) [12,13] in order to capture changes in filtering stages based on the user-facing parameters controlling the effect. We consider many cascaded biquad (CB) representations and introduce activation functions necessary for their stable training.…”
Section: Introduction (mentioning)
Confidence: 99%
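The remark about activation functions for stable training refers to keeping the poles of each learned biquad inside the unit circle. The sketch below shows one common parameterization of this idea, offered as a generic technique rather than the cited paper's exact scheme: tanh squashing guarantees that the denominator coefficients (a1, a2) satisfy the second-order stability conditions |a2| < 1 and |a1| < 1 + a2.

import torch
import torch.nn as nn

class StableBiquad(nn.Module):
    """One learnable biquad with denominator 1 + a1*z^-1 + a2*z^-2.  The raw
    parameters are squashed with tanh so (a1, a2) always lie inside the
    stability triangle |a2| < 1, |a1| < 1 + a2 (illustrative sketch only)."""

    def __init__(self):
        super().__init__()
        self.b = nn.Parameter(torch.tensor([1.0, 0.0, 0.0]))  # numerator b0, b1, b2
        self.theta = nn.Parameter(torch.zeros(2))              # raw denominator params

    def coefficients(self):
        a2 = torch.tanh(self.theta[1])               # |a2| < 1
        a1 = torch.tanh(self.theta[0]) * (1.0 + a2)  # |a1| < 1 + a2
        return self.b, torch.stack([torch.ones(()), a1, a2])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Direct-form I recursion over time (written for clarity, not speed); x: (time,)
        b, a = self.coefficients()
        y = []
        for n in range(x.shape[0]):
            acc = b[0] * x[n]
            if n >= 1:
                acc = acc + b[1] * x[n - 1] - a[1] * y[n - 1]
            if n >= 2:
                acc = acc + b[2] * x[n - 2] - a[2] * y[n - 2]
            y.append(acc)
        return torch.stack(y)

filt = StableBiquad()
out = filt(torch.randn(64))
print(out.shape)  # torch.Size([64])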