2019
DOI: 10.1109/tsp.2019.2919415
On Quasi-Isometry of Threshold-Based Sampling

Abstract: The problem of isometry for threshold-based sampling, such as integrate-and-fire (IF) or send-on-delta (SOD), is addressed. While for uniform sampling the Parseval theorem provides isometry and makes the Euclidean metric canonical, there is no analogy for threshold-based sampling. The relaxation of the isometric postulate to quasi-isometry, however, allows the discovery of the underlying metric structure of threshold-based sampling. This paper characterizes this metric structure, making Hermann Weyl's discrepancy …
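To make the sampling schemes in the abstract concrete: a send-on-delta sampler emits a new sample only when the signal has moved by at least a threshold Δ since the last emitted value. The sketch below is illustrative only and is not the paper's formalism; the function name `send_on_delta`, the test signal, and the threshold value are assumptions of this sketch.

```python
import numpy as np

def send_on_delta(t, x, delta):
    """Send-on-delta sampler: emit a sample whenever the signal
    deviates from the last emitted value by at least delta."""
    times, values = [t[0]], [x[0]]
    for ti, xi in zip(t[1:], x[1:]):
        if abs(xi - values[-1]) >= delta:
            times.append(ti)
            values.append(xi)
    return np.array(times), np.array(values)

# A 3 Hz sine sampled densely, then resampled by threshold crossings.
t = np.linspace(0.0, 1.0, 1000)
x = np.sin(2 * np.pi * 3 * t)
ts, vs = send_on_delta(t, x, delta=0.2)
```

By construction, consecutive emitted values differ by at least Δ, so the output rate adapts to how fast the signal moves rather than to a fixed clock.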

Cited by 11 publications (8 citation statements)
References 43 publications
“…It increases the computational complexity compared with the suggested solution. The adaptive-rate signal collection, in contrast to the conventional approach, implicitly provides information about the signal's frequency content in the time domain [23,28]. It enables adequate feature mining to be realized through time-domain analysis alone.…”
Section: Results
Citation type: mentioning (confidence: 99%)
“…The ASA is a novel approach that permits exploiting the important information of the signal without any computationally complex transformation. Therefore, in the studied case, the classifiable features are mined directly in the time domain without involving any complex transformation scheme [23].…”
Section: Features Extraction
Citation type: mentioning (confidence: 99%)
“…1. This can be explained by a limited use of the analytical information contained in (5). An early observation is that little is done in these methods to make the bandlimited estimates consistent with this information.…”
Section: Reconstruction by POCS
Citation type: mentioning (confidence: 99%)
“…1). Time encoding is attracting increasing interest in data acquisition [1], [2], [3], [4], [5], [6], [7], [8], [9], [10], as the downscaling of semiconductor integration increases time precision while reducing amplitude accuracy [11], [12], [13]. This is also part of the trend toward event-based sampling [14], with more general objectives such as making acquisition activity depend on input activity for power efficiency.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
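The time-encoding scheme this excerpt refers to, integrate-and-fire, can be sketched minimally: the input is integrated until the accumulated integral reaches a threshold θ, at which point a spike time is emitted and the integrator resets. This is a hypothetical illustration assuming a non-negative input on a uniform grid; the function name and parameter values are choices of this sketch, not of the cited works.

```python
import numpy as np

def integrate_and_fire(t, x, theta):
    """Integrate-and-fire sampler: accumulate the integral of a
    non-negative signal x(t) on a uniform grid; emit a spike time
    whenever the accumulator reaches theta, then subtract theta."""
    spikes = []
    acc = 0.0
    dt = t[1] - t[0]  # uniform sampling step assumed
    for ti, xi in zip(t, x):
        acc += xi * dt
        if acc >= theta:
            spikes.append(ti)
            acc -= theta  # reset while keeping the overshoot
    return np.array(spikes)

# A constant unit input with theta = 0.1 fires roughly every 0.1 s.
t = np.arange(0.0, 1.0, 0.001)
spikes = integrate_and_fire(t, np.ones_like(t), theta=0.1)
```

The spike times alone encode the signal: the integral of x between consecutive spikes always equals θ, which is exactly the amplitude-to-time trade the excerpt describes.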
“…It captures and processes redundant samples that increase the system's overall computational load, power consumption, processing, and transmission activities [27,29]. These shortfalls can be compensated for, to a certain extent, by using level-crossing analog-to-digital converters (LCADCs) [16,24,30–35]. These converters adapt the system's acquisition and processing rates to the incoming signal's temporal variations, which renders the suggested approach significantly more computationally efficient than traditional ones.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
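A level-crossing ADC, as described in this excerpt, records an event each time the input crosses one of a fixed grid of reference levels, so the event rate tracks the signal's activity rather than a fixed clock. A minimal sketch, in which the function name, level grid, and test signal are illustrative assumptions:

```python
import numpy as np

def level_crossings(t, x, levels):
    """Record (time, level) events whenever the signal crosses one of
    the fixed reference levels between consecutive samples."""
    events = []
    for i in range(1, len(x)):
        for lv in levels:
            # A strict sign change around lv indicates a crossing.
            if (x[i - 1] - lv) * (x[i] - lv) < 0:
                events.append((t[i], lv))
    return events

# A 3 Hz sine over one second crosses the 0.5 level six times.
t = np.linspace(0.0, 1.0, 1000)
x = np.sin(2 * np.pi * 3 * t)
events = level_crossings(t, x, levels=[0.5])
```

A slowly varying input generates few events while a fast-moving one generates many, which is the activity-dependent acquisition behavior the excerpt attributes to LCADCs.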