2021
DOI: 10.1214/21-ejs1838

Sparse random tensors: Concentration, regularization and applications

Abstract: We prove a non-asymptotic concentration inequality for the spectral norm of sparse inhomogeneous random tensors with Bernoulli entries. For an order-k inhomogeneous random tensor T with sparsity p_max ≥ c log n / n, we show that ‖T − E T‖ = O(√(n p_max) log^{k−2}(n)) with high probability. The optimality of this bound up to polylog factors is provided by an information-theoretic lower bound. By tensor unfolding, we extend the range of sparsity to p_max ≥ c log n / n^m with 1 ≤ m ≤ k − 1 and obtain concentration inequalities for different sparsity regimes. We also provide a sim…
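
To make the tensor-unfolding idea from the abstract concrete, here is a minimal numerical sketch (not code from the paper): it samples an order-3 Bernoulli(p) random tensor, unfolds the centered tensor along its first mode, and reports the operator norm of that unfolding. The function names, the choice k = 3, and the √(n² p) reference scale for the n × n² unfolded matrix are illustrative assumptions, not the paper's exact statement.

```python
import numpy as np

def unfold(tensor, mode=0):
    """Mode-`mode` matricization: rows indexed by `mode`, columns by the remaining modes."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def centered_unfolding_norm(n=60, p=0.05, k=3, seed=0):
    """Sample an order-k Bernoulli(p) tensor T, subtract its mean E[T] = p,
    and return the spectral norm (largest singular value) of the mode-0 unfolding."""
    rng = np.random.default_rng(seed)
    T = (rng.random(size=(n,) * k) < p).astype(float)
    return np.linalg.norm(unfold(T - p, mode=0), ord=2)

if __name__ == "__main__":
    n, p = 60, 0.05
    norm = centered_unfolding_norm(n=n, p=p)
    # Heuristic reference scale for an n x n^2 centered Bernoulli(p) matrix
    # in the regime where n^2 * p >> log n (illustrative comparison only).
    print(f"||unfold(T - E T)||_op = {norm:.2f},  sqrt(n^2 * p) = {np.sqrt(n**2 * p):.2f}")
```

The regularization step that the paper introduces for sparser regimes (mentioned in the title and the truncated end of the abstract) is not reflected in this sketch.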

Cited by 8 publications (5 citation statements) · References 52 publications

“…where we use the fact that P ≤ q − 1 and w = 1, and the last inequality is from (80). Since (q − 1)µ_i > (q − 1)d for i ∈ [r_0], from (82),…”
Section: Structure of Near Eigenvectors (mentioning)
confidence: 99%
“…Sparse random hypergraphs are fundamental objects in probabilistic combinatorics, but their spectral properties (their adjacency tensors or adjacency matrices) have not been well understood yet. Eigenvalues and the spectral norm for adjacency tensors were considered in [46,55,82,62,31], and the eigenvalue distributions and concentration of the adjacency matrices were studied in [44,65,42,41,61,40]. We believe the non-backtracking operator for hypergraphs could be a promising tool in the study of sparse random tensors and random hypergraphs.…”
Section: Introduction (mentioning)
confidence: 99%
“…The recent paper [60] analyzed non-uniform hypergraph community detection by using hypergraph embedding and optimization algorithms and obtained weak consistency when the expected degrees are of ω(log n), again a complementary regime to ours. Results on spectral norm concentration of sparse random tensors were obtained in [19,48,29,36,62], but no provable tensor algorithm in the bounded expected degree is known. Testing for the community structure for non-uniform hypergraphs was studied in [56,30], which is a problem different from community detection.…”
Section: 3 (mentioning)
confidence: 99%
“…Hypergraphs can represent more complex relationships among data [9,8], including recommendation systems [11,38], computer vision [26,55], and biological networks [42,53], and they have been shown empirically to have advantages over graphs [61]. Besides community detection problems, sparse hypergraphs and their spectral theory have also found applications in data science [29,62,27], combinatorics [20,22,52], and statistical physics [12,50].…”
Section: Introduction (mentioning)
confidence: 99%
“…Although random tensors appear frequently in data science problems [5,19,26,36,48,37,41,16,14,28,24,7,46,12,54], a systematic theory of random tensors is still in its infancy.…”
Section: Relaxing Independence? (mentioning)
confidence: 99%