2018
DOI: 10.48550/arxiv.1810.09880
Preprint
Empirical Regularized Optimal Transport: Statistical Theory and Applications

Abstract: We derive limit distributions for empirical regularized optimal transport distances between probability distributions supported on a finite metric space and show consistency of the (naive) bootstrap. In particular, we prove that the empirical regularized transport plan itself asymptotically follows a Gaussian law. The theory includes the Boltzmann-Shannon entropy regularization and hence a limit law for the widely applied Sinkhorn divergence. Our approach is based on an application of the implicit function theorem…
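The entropy-regularized transport cost and Sinkhorn divergence that the abstract refers to can be computed on a finite space with a few alternating-scaling iterations. The following is a minimal numpy sketch, not the paper's implementation; the function name and parameter values are illustrative assumptions:

```python
import numpy as np

def sinkhorn(mu, nu, C, eps=0.05, n_iter=500):
    """Entropy-regularized OT between finitely supported distributions.

    mu, nu : probability vectors on the two finite supports
    C      : cost matrix, C[i, j] = cost of moving mass from i to j
    eps    : entropic regularization strength
    Returns the regularized transport plan pi and the cost <pi, C>.
    """
    K = np.exp(-C / eps)               # Gibbs kernel
    u = np.ones_like(mu)
    for _ in range(n_iter):            # alternating marginal scalings
        v = nu / (K.T @ u)
        u = mu / (K @ v)
    pi = u[:, None] * K * v[None, :]   # regularized transport plan
    return pi, float((pi * C).sum())

# example: uniform measures on two points at mutual cost 1
mu = np.array([0.5, 0.5])
nu = np.array([0.5, 0.5])
C = np.array([[0.0, 1.0],
              [1.0, 0.0]])
pi, cost = sinkhorn(mu, nu, C)
```

Since `mu == nu` here, almost all mass stays put and the regularized cost is close to zero; as `eps` shrinks, the plan approaches the unregularized optimal coupling.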

Cited by 8 publications (13 citation statements) · References 30 publications
“…Inference for the optimal value of an optimization problem (2.2) is generally a hard task, and we focus on finite sample spaces. This simplification is common in the literature on inferential tools for optimal transport problems (Sommerfeld and Munk, 2018;Klatt et al, 2018). As we shall see, the restriction of finite spaces is sufficient for many practical problems, including evaluating the algorithmic fairness of the COMPAS recidivism prediction instrument.…”
Section: The Auditor's Problem
confidence: 99%
“…(ii) A central limit theorem characterizing the fluctuations S(P n , Q n ) − ES(P n , Q n ) when P and Q are subgaussian (Section 3). Such a central limit theorem was previously only known for probability measures supported on a finite number of points (Bigot et al, 2017;Klatt et al, 2018). We use completely different techniques, inspired by recent work of Del Barrio and Loubes (2019), to prove our theorem for general subgaussian distributions.…”
Section: Summary Of Contributions
confidence: 99%
“…In this section, we accomplish this goal by showing a central limit theorem (CLT) for S(P n , Q n ), valid for any subgaussian measures. Bigot et al (2017) and Klatt et al (2018) have shown CLTs for entropic OT when the measures lie in a finite metric space (or, equivalently, when P and Q are finitely supported). Apart from being restrictive in practice, these results do not shed much light on the general situation because OT on finite metric spaces behaves quite differently from OT on R d .…”
Section: A Central Limit Theorem For Entropic OT
confidence: 99%
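The plug-in quantity S(P_n, Q_n) discussed in this excerpt, and the naive bootstrap whose consistency the paper establishes on finite spaces, can be sketched as follows. This is a self-contained illustration under assumed choices (a three-point ground space, squared-distance cost, and arbitrary sample sizes), not the paper's procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

def sinkhorn_cost(mu, nu, C, eps=0.5, n_iter=300):
    # entropic OT cost on a finite space via fixed-point iterations
    K = np.exp(-C / eps)
    u = np.ones_like(mu)
    for _ in range(n_iter):
        v = nu / (K.T @ u)
        u = mu / (K @ v)
    pi = u[:, None] * K * v[None, :]
    return float((pi * C).sum())

# ground space: three points on a line; cost = squared distance
x = np.array([0.0, 1.0, 2.0])
C = (x[:, None] - x[None, :]) ** 2
P = np.array([0.5, 0.3, 0.2])
Q = np.array([0.2, 0.3, 0.5])

# empirical measure P_n from n i.i.d. draws, then plug-in estimate
n = 2000
sample = rng.choice(3, size=n, p=P)
P_n = np.bincount(sample, minlength=3) / n
S_hat = sinkhorn_cost(P_n, Q, C)

# naive bootstrap: resample from P_n to approximate the law of the estimate
boot = [sinkhorn_cost(np.bincount(rng.choice(3, size=n, p=P_n),
                                  minlength=3) / n, Q, C)
        for _ in range(200)]
lo, hi = np.quantile(boot, [0.025, 0.975])
```

The interval `[lo, hi]` is the percentile bootstrap confidence band for the regularized cost; the finite-space CLT results cited above are what justify this resampling scheme in this setting.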
“…In contrast, the confidence intervals derived in the present paper are valid under either no assumptions or mild moment assumptions on P and Q, and are applied more generally to the Sliced Wasserstein distance in arbitrary dimension. Other inferential results for Wasserstein distances include those of Sommerfeld and Munk (2018); Tameling et al (2017); Klatt et al (2018) when the support of P and Q are finite or countable sets, and the work of Rippl et al (2016) when P and Q only differ by a location-scale transformation.…”
Section: Introduction
confidence: 99%