Upper Bounds in Classical Discrepancy Theory
2014
DOI: 10.1007/978-3-319-04696-9_1

Cited by 6 publications (5 citation statements)
References 27 publications
“…For envy-freeness, we consider a similar additive approximation, EF-r, where an agent's utility must not increase by more than r when swapping places with another agent. We make a connection to discrepancy theory (Chen et al. 2014) to show that an EF-O((n/k)·log k) partition always exists, and it can be computed efficiently. We conjecture that an EF-2 partition may always exist for any k.…”
Section: Our Results
confidence: 99%
“…Instead, our focus is on providing worst-case guarantees on the necessary violation of envy-freeness, as is commonly done in the literature on fair resource allocation (Lipton et al. 2004; Caragiannis et al. 2019; Aziz et al. 2019). We make a connection to discrepancy theory (Chen et al. 2014) to establish an O(√n) bound. In discrepancy theory, the goal is to distribute each agent's friends as evenly as possible between the parts, so that not only does an agent not have many more friends in another part than in her own part, she also does not have many more friends in her own part than in any other part.…”
Section: Related Work
confidence: 99%
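The combinatorial-discrepancy idea invoked in these citing papers — splitting every agent's friend set as evenly as possible between two parts — can be illustrated with a small sketch. The greedy two-coloring below is a hypothetical illustration of the general technique, not the algorithm of Chen et al. (2014); the instance sizes are made up for demonstration:

```python
import random

def greedy_two_coloring(sets, n):
    """Assign each of n elements a sign (+1 or -1), greedily choosing the
    sign that keeps the running imbalance of every set containing it small."""
    color = [0] * n  # 0 = not yet assigned
    for x in range(n):
        # net imbalance, summed over all sets containing x, under current signs
        score = sum(sum(color[y] for y in s) for s in sets if x in s)
        color[x] = -1 if score > 0 else 1
    return color

def discrepancy(sets, color):
    """Max over sets of |#(+1 members) - #(-1 members)|."""
    return max(abs(sum(color[y] for y in s)) for s in sets)

random.seed(0)
n = 40
# each "agent" has a set of 10 friends among the n elements
sets = [set(random.sample(range(n), 10)) for _ in range(20)]
color = greedy_two_coloring(sets, n)
print(discrepancy(sets, color))
```

Since every set here has 10 members, the imbalance of any coloring is an even number between 0 and 10; the greedy pass typically lands well below the trivial maximum.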
“…Consider the number of points N_R, from a sequence {θ_i}, i = 1, …, N, that fall in an n-dimensional rectangle R ⊂ I^n centred on the origin 0, with sides parallel to the coordinate axes, where R carries a measure. A sequence has low discrepancy if the proportion of points in the sequence falling into an arbitrary set R is close to the measure of R. LDS satisfies the upper bound condition [33]:…”
Section: Quasi Monte-Carlo Sampling Methods
confidence: 99%
“…A sequence has low discrepancy if the proportion of points in the sequence falling into an arbitrary set R is close to the measure of R. LDS satisfies the upper bound condition [33]: D_N ≤ k(n)·(log N)^n / N, where D_N is the sample discrepancy and k(n) is a particular constant depending on the sequence and the size of the input parameter space. LDS is designed to place sample points as uniformly as possible mathematically, within a hypercube, instead of the statistical approach adopted in LH.…”
Section: Methods
confidence: 99%
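The (log N)^n / N behaviour cited above is easiest to see in one dimension, where the star discrepancy of a point set has a simple closed form over the sorted points. The sketch below (an illustration, not taken from [33]) compares the base-2 van der Corput low-discrepancy sequence with pseudo-random sampling:

```python
import random

def van_der_corput(i, base=2):
    """i-th point of the base-b van der Corput low-discrepancy sequence
    (digit reversal of i in the given base)."""
    x, f = 0.0, 1.0 / base
    while i > 0:
        x += (i % base) * f
        i //= base
        f /= base
    return x

def star_discrepancy_1d(points):
    """Exact star discrepancy D_N* for points in [0, 1), using the
    1-D formula over the sorted sample."""
    xs = sorted(points)
    N = len(xs)
    return max(max((k + 1) / N - x, x - k / N) for k, x in enumerate(xs))

N = 1024
lds = [van_der_corput(i) for i in range(1, N + 1)]
random.seed(0)
rnd = [random.random() for _ in range(N)]
print(star_discrepancy_1d(lds), star_discrepancy_1d(rnd))
```

For the van der Corput sequence the printed discrepancy is on the order of (log N)/N, roughly 0.01 at N = 1024, whereas the random sample's discrepancy shrinks only at the slower Monte-Carlo rate.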