2020
DOI: 10.1002/rsa.20909

On the discrepancy of random matrices with many columns

Abstract: Motivated by the Komlós conjecture in combinatorial discrepancy, we study the discrepancy of random matrices with m rows and n independent columns drawn from a bounded lattice random variable. It is known that for n tending to infinity and m fixed, with high probability the ℓ∞-discrepancy is at most twice the ℓ∞-covering radius of the integer span of the support of the random variable. However, the easy argument for the above fact gives no concrete bounds on the failure probability in terms of n. We prove …
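For reference, the ℓ∞-discrepancy discussed in the abstract is the standard minimax quantity written out below; this is a minimal restatement of the usual definition (the notation disc_∞ is ours), not a formula quoted from the paper itself.

% ℓ∞-discrepancy of an m×n matrix A: the smallest worst-row imbalance over ±1 colorings of the columns.
\[
  \operatorname{disc}_\infty(A) \;=\; \min_{x \in \{-1,+1\}^n} \|Ax\|_\infty .
\]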

Cited by 10 publications (13 citation statements)
References 11 publications
“…Another result that Ezra and Lovett proved in [EL15] is that if n ≫ m^t, then the discrepancy of the random t-regular hypergraph is at most O(1). This has recently been improved by Franks and Saks [FS18], who prove that as long as n = Ω(m^3 log^2 m), the discrepancy of the random t-regular hypergraph is almost surely at most 1.…”
Section: Discrepancy In Random Settings
confidence: 99%
“…The o(1) in the above theorem is O(1/√(log n)). One thing that is common to [FS18] and [HR18] is a multivariate local limit theorem on the number of colorings that give discrepancy at most 1, inspired by [KLP12]. Our approach for this is more direct and essentially just amounts to a careful second moment computation of the number of colorings which give discrepancy at most 1, and is the content of Section 4.…”
Section: Discrepancy In Random Settings
confidence: 99%
“…We argue that this is an interesting regime for random hypergraphs, as this kind of discrepancy bound is not implied by the Beck-Fiala conjecture. The case where n = Ω(m log m) is of particular interest, and we conclude with a conjecture, building on an open problem (Open Problem 1) in [FS18]: Conjecture 3.1. There is an absolute constant K > 0 such that the following holds.…”
Section: Conclusion and Discussion
confidence: 81%
“…However, in this regime, we believe that with high probability, the discrepancy is much lower than √t (in contrast to λ growing). Recently, Franks and Saks [FS18] showed that for n = Ω(m^3), the discrepancy is O(t). We argue that this is an interesting regime for random hypergraphs, as this kind of discrepancy bound is not implied by the Beck-Fiala conjecture.…”
Section: Conclusion and Discussion
confidence: 99%
“…Ezra and Lovett [19] consider a regular model in which a binary matrix A is chosen uniformly at random conditioned on each column having exactly t ones, for some sparsity parameter t. They show that when n > m^t, disc(A) = O(1) with high probability. Franks and Saks [21] removed this exponential dependency on t: they showed that if n = Ω(m^3 log^2 m), then disc(A) ≤ 2 with high probability. In an independent and simultaneous work, Hoberg and Rothvoß [24] considered A drawn from the (m, n, p)-Bernoulli ensemble and gave the improved bound that disc(A) ≤ 1 with high probability if n = Ω(m^2 log m) and mp = Ω(log n).…”
Section: Previous Work
confidence: 99%
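To make the quantity in these excerpts concrete, the following is a minimal, self-contained sketch that computes disc(A) = min over x ∈ {−1,+1}^n of ||Ax||∞ by brute force for a small random matrix with exactly t ones per column. The column model, parameter values, and function names are illustrative assumptions for this sketch, not code from any of the cited papers, and the enumeration is only feasible for small n.

import itertools
import numpy as np

def discrepancy(A: np.ndarray) -> int:
    """Exact l-infinity discrepancy of A by brute force over all +/-1 colorings.

    Enumerates 2^n colorings, so this is only practical for small n.
    """
    m, n = A.shape
    best = None
    for signs in itertools.product((-1, 1), repeat=n):
        x = np.array(signs)
        val = int(np.max(np.abs(A @ x)))
        if best is None or val < best:
            best = val
    return best

def random_t_regular(m: int, n: int, t: int, rng: np.random.Generator) -> np.ndarray:
    """Binary m x n matrix whose columns each have exactly t ones (illustrative column model)."""
    A = np.zeros((m, n), dtype=int)
    for j in range(n):
        rows = rng.choice(m, size=t, replace=False)
        A[rows, j] = 1
    return A

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = random_t_regular(m=5, n=12, t=3, rng=rng)
    print("disc(A) =", discrepancy(A))

Running this for a single small instance only illustrates the definition; the results quoted above concern the asymptotic regime where n grows polynomially in m.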