2014
DOI: 10.1007/s00440-014-0591-7
Sublinear variance in first-passage percolation for general distributions

Abstract: We prove that the variance of the passage time from the origin to a point x in first-passage percolation on Z^d is sublinear in the distance to x when d ≥ 2, obeying the bound C|x|/log|x|, under minimal assumptions on the edge-weight distribution. The proof applies equally to absolutely continuous, discrete and singular continuous distributions and mixtures thereof, and requires only 2 + log moments. The main result extends work of Benjamini-Kalai-Schramm [4] and Benaïm-Rossignol [6].

Cited by 39 publications (57 citation statements) · References 36 publications
“…In [22], it is shown that, if F has finite second moment, then c_1 ≤ Var(T(0,n)) ≤ c_2 n. As for upper bounds, the best bound to date is Var(T(0,n)) = O(n/log n), which is proved for uniform distributions in [5], for a larger class of (non-atomic) distributions in [4], and for general distributions in [11]. As for lower bounds, the best result is that Var(T(0,n)) is at least of order log n; see [1,29,30,36].…”
Section: Asymptotic Shape
confidence: 97%
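The variance bounds quoted above can be explored numerically. Below is a minimal sketch (my own illustration, not from the paper) that computes the passage time T(0,(n,0)) on a finite box of Z^2 by Dijkstra's algorithm with iid Exp(1) edge weights; the function name `passage_time` and the box truncation are assumptions of this sketch, and restricting paths to a finite box biases the passage time slightly upward.

```python
import heapq
import random


def passage_time(n, rng):
    """First-passage time T(0, (n,0)) by Dijkstra on the box {0..n} x {-n..n}
    of Z^2, with iid Exp(1) edge weights drawn lazily (one weight per edge)."""
    weights = {}

    def w(u, v):
        # Fix one weight per undirected edge, sampled on first use.
        key = (u, v) if u < v else (v, u)
        if key not in weights:
            weights[key] = rng.expovariate(1.0)
        return weights[key]

    start, target = (0, 0), (n, 0)
    dist = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == target:
            return d
        if d > dist[u]:
            continue  # stale queue entry
        x, y = u
        for v in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= v[0] <= n and -n <= v[1] <= n:
                nd = d + w(u, v)
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(pq, (nd, v))
    raise RuntimeError("target unreachable")


if __name__ == "__main__":
    import statistics
    rng = random.Random(0)
    samples = [passage_time(10, rng) for _ in range(40)]
    print("mean:", statistics.mean(samples), "var:", statistics.variance(samples))
```

Repeating this for growing n, the sample variance grows noticeably slower than n, which is consistent with the O(n/log n) bound, although distinguishing n/log n from n numerically requires very large n.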
“…This theorem was generalized by Damron, Hanson, and Sosoe [8] under weak assumptions on μ. This is a good point to demonstrate a simple but powerful idea developed by Benjamini, Kalai, and Schramm.…”
Section: Theorem 1.3 ([4, Theorem 1])
confidence: 98%
“…Within each unit box B, a Poisson random variable determines how many points are in the box, and uniform random variables determine the positions of the points. Following the techniques in [DHS15], we will encode the Poisson random variable through an infinite sequence of Bernoulli random variables, described below. Once this is done, we can apply logarithmic Sobolev inequalities to argue that the sum of the entropies, Σ_i Ent(V_i^2), is bounded by the sum of derivatives with respect to these uniform and Bernoulli random variables, as described in Lemma 3.2 below.…”
Section: Bounding Entropy
confidence: 99%
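The Bernoulli encoding mentioned in the last excerpt can be sketched concretely. The following toy example is my own illustration (not the construction in [DHS15]): it reads fair Bernoulli bits as the binary expansion of a uniform variable and inverts the Poisson CDF; the function name `poisson_from_bernoullis` is hypothetical.

```python
import math


def poisson_from_bernoullis(lam, bits):
    """Toy encoding: treat fair Bernoulli bits as the binary expansion of a
    uniform U in [0, 1), then return the smallest n with P(Poisson(lam) <= n) >= U.
    Almost surely, the output is determined by finitely many bits."""
    u = sum(b * 0.5 ** (k + 1) for k, b in enumerate(bits))
    term = math.exp(-lam)  # P(N = 0)
    cdf, n = term, 0
    while cdf < u:
        n += 1
        term *= lam / n    # P(N = n) from P(N = n - 1)
        cdf += term
    return n
```

For example, bits [0, 0, ...] give U = 0 and output 0, while bits [1, 0, 0, ...] give U = 1/2 and, for lam = 1, output 1 (since P(N = 0) = e^{-1} ≈ 0.368 < 0.5 ≤ P(N ≤ 1) ≈ 0.736). Resampling a single bit changes the output only when U crosses a CDF threshold, which is the kind of small-influence structure the entropy argument exploits.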