2003 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2003. Proceedings. (ICASSP '03).
DOI: 10.1109/icassp.2003.1202774

On the importance of exact synchronization for distributed audio signal processing

Cited by 42 publications (33 citation statements)
References 5 publications
“…Digital signal processing tools that take audio from multiple different transducers as their input, such as blind source separation (BSS) and acoustic echo cancellation (AEC), will not work as expected if these audio streams are not synchronized [2]. To use these tools with audio streams generated by distributed devices it is necessary to correct the mismatches in sampling rate.…”
Section: Introduction (mentioning)
confidence: 98%
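As the statement above notes, even a small sampling-rate mismatch between independently clocked devices accumulates over time. A minimal Python sketch of that drift (the 16 kHz nominal rate and 50 ppm clock error are assumed illustration values, not figures from the cited work):

```python
# Assumed example: two devices nominally at 16 kHz, device B running 50 ppm fast.
FS_NOMINAL = 16_000.0   # Hz, assumed nominal sampling rate
PPM_OFFSET = 50.0       # assumed clock error of device B, parts per million
fs_a = FS_NOMINAL
fs_b = FS_NOMINAL * (1.0 + PPM_OFFSET * 1e-6)

def drift_in_samples(t_seconds: float) -> float:
    """Offset (in samples of device A) between the two streams after t seconds."""
    return t_seconds * (fs_b - fs_a)

for t in (1.0, 60.0, 3600.0):
    d = drift_in_samples(t)
    print(f"after {t:6.0f} s: drift = {d:7.1f} samples ({d / fs_a * 1e3:6.2f} ms)")
```

At this assumed offset the streams drift apart by roughly 180 ms per hour, which illustrates why the cited statement says tools such as BSS and AEC will not work as expected unless the mismatch is corrected.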
“…The necessary resampling factors may take arbitrary values and may change with time, so a traditional resampling approach using cascaded decimators and interpolators is not practical for this. Instead, interpolation filters are typically used [4,2,3]. This still presents computational challenges, due to the need for a different interpolation filter for each output sample.…”
Section: Introduction (mentioning)
confidence: 99%
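The statement above points out that arbitrary, time-varying resampling ratios rule out fixed decimator/interpolator cascades and instead require evaluating an interpolation filter at a new fractional position for every output sample. A minimal Python sketch of that per-output-sample structure, using 2-tap linear interpolation as a stand-in for the longer interpolation filters the cited works describe (the drift value in the usage lines is an assumed example):

```python
import numpy as np

def resample_arbitrary(x: np.ndarray, ratio_fn, n_out: int) -> np.ndarray:
    """Resample x with a per-sample, possibly time-varying input/output ratio.

    ratio_fn(n) gives the instantaneous ratio at output sample n.  Each output
    sample gets its own fractional read position, so an interpolation filter is
    evaluated per sample -- here plain linear interpolation stands in for the
    longer filters used in practice.
    """
    y = np.zeros(n_out)
    pos = 0.0                          # fractional read position into x
    for n in range(n_out):
        i = int(pos)
        if i + 1 >= len(x):            # ran out of input samples
            return y[:n]
        frac = pos - i
        y[n] = (1.0 - frac) * x[i] + frac * x[i + 1]   # 2-tap interpolation
        pos += ratio_fn(n)             # ratio may change from sample to sample
    return y

# Usage: compensate an assumed +50 ppm clock drift on a 16 kHz test tone.
fs = 16_000
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 440.0 * t)
y = resample_arbitrary(x, lambda n: 1.0 + 50e-6, len(x))
```

The per-sample loop is where the computational cost mentioned in the statement comes from: a higher-quality design replaces the two-tap interpolation with a longer filter (e.g. windowed-sinc or polyphase) evaluated at each fractional position.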
“…The first problem can be dealt with by connecting the microphones to much cheaper single-channel A/D converters; however, the result is desynchronization of the recorded signals. Lienhart et al. proposed to synchronize the recording devices over a network [9]. Another solution has been proposed by Ono et al., who developed a method to jointly estimate the microphone locations, the single source location and the time origins of the recording devices [8,10].…”
Section: Introduction (mentioning)
confidence: 99%
“…al. [4] developed a system to synchronize the audio signals by having the individual microphone devices send special synchronization signals over a dedicated link. Raykar et.…”
Section: Introduction (mentioning)
confidence: 99%