2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2014.6855146

Introducing Legendre nonlinear filters

Abstract: This paper introduces a novel sub-class of linear-in-the-parameters nonlinear filters, the Legendre nonlinear filters. Their basis functions are polynomials, specifically, products of Legendre polynomial expansions of the input signal samples. Legendre nonlinear filters share many of the properties of the recently introduced classes of Fourier nonlinear filters and even mirror Fourier nonlinear filters, which are based on trigonometric basis functions. In fact, Legendre nonlinear filters are universal approximators…
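
To make the construction concrete, the following is a minimal Python sketch of how LN basis functions could be evaluated: Legendre polynomials applied to the delayed input samples, with products kept up to a chosen total order. The function name ln_basis and the order/memory bookkeeping are illustrative assumptions, not taken from the paper.

# Minimal sketch of Legendre nonlinear (LN) filter basis functions:
# products of Legendre polynomials of the delayed input samples.
# Names and bookkeeping are illustrative, not from the paper.
import itertools
import numpy as np
from scipy.special import eval_legendre

def ln_basis(x_frame, max_order):
    """Evaluate the LN basis functions on one frame of input samples.

    x_frame   : the N most recent input samples, assumed to lie in [-1, 1]
    max_order : maximum total polynomial order of the products
    """
    N = len(x_frame)
    basis = []
    # Assign a Legendre order 0..max_order to each sample and keep the
    # products whose total order lies between 1 and max_order.
    for orders in itertools.product(range(max_order + 1), repeat=N):
        if 0 < sum(orders) <= max_order:
            value = 1.0
            for k, sample in zip(orders, x_frame):
                value *= eval_legendre(k, sample)
            basis.append(value)
    return basis

# Memory N = 3, order 2: 3 linear + 6 quadratic terms = 9 basis functions.
x = np.array([0.3, -0.5, 0.8])
print(len(ln_basis(x, 2)))

The filter output is then a linear combination of these basis functions, which is what makes the filter linear-in-the-parameters.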

Cited by 19 publications (16 citation statements, 2015–2024) · References 27 publications
“…In addition to local disturbances, conventional AEC algorithms are also impeded by nonlinearities in the echo path due to nonlinear components such as miniaturized loudspeakers. This can be addressed by modelling the echo path using nonlinear structures such as Hammerstein models [10,11,12]. Moreover, linear echo cancellation is often limited due to the use of insufficiently long finite impulse response (FIR) filters.…”
Section: Introduction
confidence: 99%
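
The Hammerstein structure mentioned in the statement above is a memoryless nonlinearity followed by a linear FIR filter. A minimal sketch, with an assumed soft-saturation polynomial and toy echo-path taps (all values illustrative, not from the cited work):

# Hammerstein echo-path model: static polynomial nonlinearity
# followed by a linear FIR stage. All coefficients are toy values.
import numpy as np

def hammerstein(x, poly_coeffs, fir_taps):
    # Memoryless nonlinearity: z(n) = sum_k poly_coeffs[k] * x(n)**k
    z = np.polyval(poly_coeffs[::-1], x)
    # Linear FIR stage modelling the echo-path impulse response.
    return np.convolve(z, fir_taps)[: len(x)]

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 1024)            # far-end signal
a = [0.0, 1.0, 0.0, -0.2]                   # soft saturation: x - 0.2 x^3
h = np.array([1.0, 0.5, 0.25, 0.125])       # toy echo-path FIR taps
echo = hammerstein(x, a, h)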
“…In the third experiment, two data sets proposed for benchmarking in nonlinear system identification [52] and composed of data recorded on real nonlinear systems are used to assess the performance of the proposed filter. The characteristics, the computational complexity and the performance of the CN filters are carefully analyzed in comparison to those of the well-known Volterra filters, the EMFN filters described in [29,30], and the LN filters introduced in [31,32]. The perfect sequences for EMFN and LN filters, used in the second experiment, have been derived in [35,36] and [32,37], respectively, and are available in [53].…”
Section: Results
confidence: 99%
“…Recently, the finite-memory LIP class has been enriched with novel sub-classes of nonlinear filters that guarantee the orthogonality of the basis functions for white uniform input signals in the range [−1, +1]: the Fourier nonlinear (FN) filters [28,29], the even mirror Fourier nonlinear (EMFN) filters [29,30], and the Legendre nonlinear (LN) filters [31,32]. FN and EMFN filters are based on trigonometric function expansions of the input signal samples, and do not include a linear term among the basis functions.…”
Section: Introduction
confidence: 99%
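
The orthogonality property quoted above is easy to illustrate numerically: for white uniform input on [−1, +1], Legendre polynomials of different orders have approximately zero sample correlation. A quick Monte Carlo check, illustrative only:

# For x uniform on [-1, 1], E[P_i(x) P_j(x)] = 0 whenever i != j.
import numpy as np
from scipy.special import eval_legendre

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 1_000_000)
for i in range(4):
    for j in range(i + 1, 4):
        corr = np.mean(eval_legendre(i, x) * eval_legendre(j, x))
        print(f"E[P{i} P{j}] ~ {corr:+.4f}")  # all close to zero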
“…It has been shown in [6], [7], [9], [10] that EMFN and Legendre filters are universal approximators for the input-output relationship of discrete-time, time-invariant, finite-memory nonlinear systems.…”
[Table residue: Table 1, basis functions of EMFN filters; the caption of Table 2 is not recoverable.]
Section: Basic Notions on EMFN and Legendre Filters
confidence: 99%
“…To overcome this difficulty, two new LIP nonlinear filters have been recently introduced that replicate the construction rule of Volterra filters but use different basis functions. They are the even mirror Fourier nonlinear (EMFN) filter [6], [7], [8], based on even mirror symmetric trigonometric basis functions, and the Legendre filter [9], [10], based on Legendre polynomials. Both filters are universal approximators, in the sense specified above.…”
Section: Introduction
confidence: 99%
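
Because both filters are linear-in-the-parameters, their coefficients can be estimated by ordinary least squares. A minimal identification sketch under assumed settings (a toy second-order system, memory 2, order 2; the basis enumeration is illustrative, not from the cited works):

# Least-squares identification of an LN filter with memory 2, order <= 2.
import numpy as np
from scipy.special import eval_legendre

def basis_m2_o2(x0, x1):
    # LN basis functions for samples x(n), x(n-1), total order <= 2.
    P = eval_legendre
    return [1.0, P(1, x0), P(1, x1), P(2, x0), P(2, x1), P(1, x0) * P(1, x1)]

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 2000)
# Toy system to identify: y(n) = x(n) - 0.3 x(n-1)^2 + noise.
y = x[1:] - 0.3 * x[:-1] ** 2 + 0.01 * rng.standard_normal(len(x) - 1)

X = np.array([basis_m2_o2(x[n], x[n - 1]) for n in range(1, len(x))])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
# Since x^2 = (2 P2(x) + 1) / 3, the fit concentrates on the constant,
# P1(x(n)) and P2(x(n-1)) terms (about -0.1, 1.0 and -0.2 respectively).
print(coeffs)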