2022
DOI: 10.1109/jstsp.2022.3203608
Autoregressive Predictive Coding: A Comprehensive Study

Cited by 12 publications (4 citation statements); references 51 publications.
“…Both CPC-big and CPC-small are taken from the Zero Speech 2021 baseline [3]. We use our own implementation of APC, following [15]. The CPC and APC models differ in several ways.…”
Section: Methods (confidence: 99%)
“…We follow recent work in speech technology and machine learning in using the term predictive coding to refer to error-driven learning based on forward prediction, contrasting with masked prediction or other objectives. In the more general sense used in information theory [14,15] and computational neuroscience [16,17,18], masked prediction can also be viewed as a type of predictive coding.…”
(confidence: 99%)
“…Following [14,6], we fix the time shift k to 5, choosing the recurrent network as 3-layer 512-dimensional unidirectional LSTMs. We will compare training the model with VQ-APC and neural HMMs.…”
Section: Methods (confidence: 99%)
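The forward-prediction setup in the statement above (predict the frame k = 5 steps ahead from the context so far) can be sketched minimally as follows. This is an illustrative sketch, not the authors' code: the 3-layer 512-dimensional unidirectional LSTM is replaced here by a trivial last-frame predictor, the L1 loss follows the original APC formulation, and all names are hypothetical.

```python
import numpy as np

def apc_l1_loss(frames, predictor, k=5):
    """L1 prediction loss for autoregressive predictive coding:
    predict frame x_{t+k} from the context x_1..x_t only (unidirectional)."""
    T = len(frames)
    # One prediction per position t that has a target k steps ahead.
    preds = np.stack([predictor(frames[: t + 1]) for t in range(T - k)])
    targets = frames[k:]
    return float(np.abs(preds - targets).mean())

# Trivial baseline predictor: echo the most recent frame. It stands in for
# the 3-layer, 512-dimensional unidirectional LSTM used in the cited work.
last_frame = lambda context: context[-1]

rng = np.random.default_rng(0)
frames = rng.standard_normal((20, 8))  # 20 frames of 8-dim features (toy sizes)
loss = apc_l1_loss(frames, last_frame, k=5)
```

Because the predictor only sees frames up to t, the objective is strictly causal, which is what distinguishes this family from masked-prediction objectives.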
“…where W is a linear projection, V is the codebook, and d is the dimension of x_t. We simply choose W to be the identity matrix, setting the codeword dimension to d. The choice of Gaussian is aligned with the L2 loss in [5,14]. Note that the parametrization does not introduce any additional parameters compared to VQ-APC.…”
Section: Neural Parametrization (confidence: 99%)
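With W fixed to the identity and a Gaussian observation model, assigning a frame to a codeword reduces to nearest-neighbor search under L2 distance in the codebook V. The sketch below illustrates that correspondence; it is an assumption-laden toy (random codebook, hypothetical names), not the cited implementation.

```python
import numpy as np

def quantize(x, codebook):
    """Assign each d-dimensional vector in x to its nearest codeword under
    L2 distance. With W = identity, maximizing a Gaussian likelihood over
    codewords is equivalent to minimizing this squared distance."""
    # Squared distances, shape (num_vectors, num_codewords), via broadcasting.
    d2 = ((x[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    idx = d2.argmin(axis=1)          # index of the nearest codeword per vector
    return codebook[idx], idx        # quantized vectors and their code indices

rng = np.random.default_rng(1)
codebook = rng.standard_normal((16, 4))  # V: 16 codewords, codeword dim d = 4
x = rng.standard_normal((10, 4))         # 10 frames x_t of dimension d
quantized, idx = quantize(x, codebook)
```

Note that, as the quoted statement says, nothing here adds parameters beyond the codebook itself: the "projection" is the identity, so the model size matches a plain VQ-APC codebook.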