2018
DOI: 10.48550/arxiv.1812.07966
Preprint

Determinantal conditions for homomorphic sensing

Abstract: Given two endomorphisms τ_1, τ_2 of C^m, we provide eigenspace conditions under which τ_1(v_1) = τ_2(v_2) implies v_1 = v_2 for v_1, v_2 ∈ V, where V is a general n-dimensional subspace of C^m for some n ≤ m/2. As a special case, we show that these eigenspace conditions hold when the endomorphisms are permutations composed with coordinate projections, leading to an abstract proof of the recent unlabeled sensing theorem of [21].
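The special case in the abstract is the unlabeled sensing setting: linear measurements of a vector are observed up to an unknown permutation. A minimal numerical sketch (illustrative names `A`, `x`, `perm`; not code from the paper) checks by brute force that, for a generic sensing matrix with m = 2n, the shuffled measurements determine x uniquely:

```python
# Illustration of the unlabeled sensing setup (not the paper's proof):
# with a generic m x n matrix A and m = 2n, a measurement y = (A @ x)
# shuffled by an unknown permutation still determines x uniquely.
from itertools import permutations

import numpy as np

rng = np.random.default_rng(0)
n, m = 2, 4                       # n-dim signal, m = 2n measurements
A = rng.standard_normal((m, n))   # generic sensing matrix
x = rng.standard_normal(n)        # unknown signal
perm = rng.permutation(m)
y = (A @ x)[perm]                 # measurements with shuffled labels

# Brute force over all m! candidate orderings: solve least squares for
# each and keep those that fit the data exactly.
solutions = []
for p in permutations(range(m)):
    xp, *_ = np.linalg.lstsq(A[list(p)], y, rcond=None)
    if np.linalg.norm(A[list(p)] @ xp - y) < 1e-8:
        solutions.append(xp)

# For a generic A, every exact fit recovers the same signal x.
assert all(np.allclose(s, x, atol=1e-6) for s in solutions)
print("recovered x uniquely:", np.allclose(solutions[0], x, atol=1e-6))
```

Enumerating all m! permutations is of course only feasible for toy sizes; the point is the uniqueness guarantee, not the algorithm.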

Cited by 12 publications (28 citation statements)
References 14 publications

“…The works [2,44] discuss alternating minimization as well as the use of the Expectation-Maximization (EM) Algorithm (combined with MCMC sampling) in which Π * constitutes missing data. The recent paper [38] discusses a branch-and-bound scheme with promising empirical performance on small data sets; the theoretical properties of the approaches [2,38,44] remain to be investigated.…”
Section: Introduction
confidence: 99%
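The alternating-minimization idea mentioned in the statement above can be sketched as follows (an assumed toy setup, not the implementation of [2,44]): alternate between matching the entries of y to the entries of A x̂, which for scalar entries is solved by sorting both sides, and re-solving least squares under the current matching. Each iteration is non-increasing in the residual, though convergence to the true permutation from a poor initialization is not guaranteed.

```python
# Hedged sketch of alternating minimization for y = P @ A @ x with an
# unknown permutation P. All names here (A, x_true, match) are
# illustrative, not taken from the cited works.
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 12
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
P = rng.permutation(m)
y = (A @ x_true)[P]              # shuffled observations

x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)   # naive initialization
residuals = []
for _ in range(50):
    z = A @ x_hat
    # Step 1: optimal 1-D matching of y to z = A @ x_hat, obtained by
    # pairing the sorted entries of both vectors (rearrangement inequality).
    match = np.empty(m, dtype=int)
    match[np.argsort(y)] = np.argsort(z)        # y_i is matched to z[match[i]]
    # Step 2: least squares under the current matching.
    x_hat, *_ = np.linalg.lstsq(A[match], y, rcond=None)
    residuals.append(np.linalg.norm(y - (A @ x_hat)[match]))

# The objective is non-increasing across iterations; global optimality
# (and hence exact recovery) is NOT guaranteed in general.
print("initial/final residual:", residuals[0], residuals[-1])
```

Both steps decrease the same objective ||y − (A x̂)[match]||², which is why the residual sequence is monotone even when the iterates stall at a local minimum.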
“…The oversampling factor of 2 was shown to be sufficient as well in the more challenging case of unlabeled sensing, where only a subvector of y is known [3]. These results were then generalized by [4] and [5] to arbitrary linear transformations beyond permutations and down-samplings; see also [6]. Bringing the noise ε back into the picture, [7] obtained SNR conditions under which recovery of Π* is possible from the maximum likelihood estimator…”
Section: Introduction
confidence: 93%
“…In [12], the authors characterized a necessary condition on the dimension of the observation vector for uniquely recovering the original data in the noiseless case. A generalized framework of unlabeled sensing was presented in [15]–[17]. The estimation of a sorted vector based on noisy observations was proposed in [18], where the MMSE estimator on sorted data was characterized as a linear combination of estimators on the unsorted data.…”
Section: A Related Work
confidence: 99%