2010
DOI: 10.1002/ett.1440
Employing Volterra filters in the ADPCM technique for speech coding: a comprehensive investigation

Abstract: Although linear filters are useful in various applications in the context of speech processing, there is considerable evidence for the existence of nonlinearity in speech signals. Our main aim is to launch a comprehensive investigation into the exploitation of nonlinear Volterra filters in the context of the ADPCM-based speech coding technique, using two methods: forward prediction, based on the LS criterion, and backward prediction, based on both LMS and RLS adaptation algorithms. In any case, after solving some…

Cited by 6 publications (13 citation statements)
References 13 publications
“…The best way to reduce the bit rate without affecting the quantization quality is the use of vector quantization (VQ) [27][28][29][30][31].…”
Section: Performance Of the Differential Scalar Quantizationmentioning
confidence: 99%
“…In practice the normalized version of this algorithm, that is, the normalized LMS algorithm, is used to alleviate the dependency of the LMS adaptive filter on the statistics of the input signal. This algorithm is usually utilized in the context of ADPCM speech coding, where the weight update equation is $w_i = w_{i-1} + \mu \frac{x_i \hat{e}(i)}{\tilde{\sigma}_i^2}$, where $0 < \mu \ll 1$ is the convergence parameter controlling the memory span of the predictor filter and therefore the convergence speed of the algorithm, and $\tilde{\sigma}_i^2$ is an estimate of the input signal variance.…”
Section: Kernel Adaptive Filtering Algorithms With Online Sparsificationmentioning
confidence: 99%
“…The exponentially weighted recursive least-squares (EW-RLS) algorithm finds the solution of a deterministic optimization problem by minimizing the exponentially weighted cost function expressed at instant i as $J(i) = \sum_{k=1}^{i} \lambda^{i-k} \left( s(k) - w_i^T x_k \right)^2$, where $0 < \lambda \leqslant 1$ is the weighting factor, also called the forgetting factor, controlling the memory span of the prediction. Using the matrix inversion lemma, one can derive a recursive realization of the EW-RLS algorithm in backward speech signal prediction.…”
Section: Kernel Adaptive Filtering Algorithms With Online Sparsificationmentioning
confidence: 99%