2017
DOI: 10.48550/arxiv.1711.08947
Preprint

Central limit theorems for entropy-regularized optimal transport on finite spaces and statistical applications

Abstract: The notion of entropy-regularized optimal transport, also known as Sinkhorn divergence, has recently gained popularity in machine learning and statistics, as it makes feasible the use of smoothed optimal transportation distances for data analysis. The Sinkhorn divergence allows the fast computation of an entropically regularized Wasserstein distance between two probability distributions supported on a finite metric space of (possibly) high dimension. For data sampled from one or two unknown probability distrib…
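For orientation only, here is a minimal sketch of the Sinkhorn matrix-scaling iterations that the abstract refers to, for histograms on a finite space. The function name `sinkhorn`, the parameters `eps` and `n_iter`, and the convention of reporting the objective ⟨P, C⟩ without the entropy term are our illustrative choices, not taken from the paper:

```python
import numpy as np

def sinkhorn(r, s, C, eps=0.1, n_iter=500):
    """Entropy-regularized OT between histograms r and s on a finite space.

    r, s : 1-D probability vectors; C : cost matrix; eps : regularization.
    Returns the regularized transport plan and <P, C> (one common convention
    for the value of the Sinkhorn divergence on finite spaces).
    """
    K = np.exp(-C / eps)               # Gibbs kernel
    u = np.ones_like(r)
    for _ in range(n_iter):            # alternating marginal scalings
        v = s / (K.T @ u)
        u = r / (K @ v)
    P = u[:, None] * K * v[None, :]    # plan with marginals (approx.) r and s
    return P, float(np.sum(P * C))

# toy usage on a 5-point space with squared-distance cost
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 5)
C = (x[:, None] - x[None, :]) ** 2
r = rng.dirichlet(np.ones(5))
s = rng.dirichlet(np.ones(5))
P, val = sinkhorn(r, s, C)
```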

Cited by 12 publications (22 citation statements)
References 25 publications
“…It includes the widely applied entropy ROT in (1.4) and hence limit distributions for the empirical entropy ROT plan and, as a consequence, for the empirical Sinkhorn divergence. For the latter see also Bigot et al. (2017) who obtained limit distributions in a similar fashion to (1.8) for the optimal value of (1.3) with a straightforward application of the technique in . Note that the technique we introduce here allows to treat the ROT plan itself also in notable distinction to λ = 0, where such a result as in (1.7) is not known.…”
Section: Introduction (mentioning)
confidence: 83%
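The labels (1.3), (1.7), and (1.8) belong to the citing paper and are not reproduced on this page. Purely as a hedged sketch, in our own notation, of the first-order delta-method-type limit such statements refer to: with $\hat r_n$ the empirical measure of $n$ samples from $r$,

```latex
\sqrt{n}\,\bigl( S_\lambda(\hat r_n, s) - S_\lambda(r, s) \bigr)
  \;\rightsquigarrow\; \bigl\langle \nabla_r S_\lambda(r, s),\, G \bigr\rangle,
\qquad
\sqrt{n}\,(\hat r_n - r) \;\rightsquigarrow\; G \sim \mathcal{N}\bigl(0, \Sigma(r)\bigr),
```

where $\Sigma(r)$ is the multinomial covariance on the finite support; the exact hypotheses and form of the limit are those of the cited papers.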
“…as then $\nabla S_\lambda(r, r) = 0$ (see also Bigot et al. (2017)). However, a second order expansion which is based on a perturbation analysis for the dual solutions provides a non degenerate asymptotic limit of $n\,S_\lambda(r_n, r)$.…”
Section: Distributional Limits (mentioning)
confidence: 98%
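A hedged sketch of the expansion this statement alludes to (our notation; the rigorous version with exact assumptions is in the citing paper): since $\nabla S_\lambda(r, r) = 0$, the first-order term vanishes, and a second-order Taylor expansion gives a limit at rate $n$ rather than $\sqrt{n}$,

```latex
n\, S_\lambda(r_n, r)
  \;=\; \tfrac{n}{2}\, (r_n - r)^{\top}\, \nabla^2 S_\lambda(r, r)\, (r_n - r) + o_P(1)
  \;\rightsquigarrow\; \tfrac{1}{2}\, G^{\top}\, \nabla^2 S_\lambda(r, r)\, G,
```

with $G$ the Gaussian limit of $\sqrt{n}(r_n - r)$, assuming the normalization $S_\lambda(r, r) = 0$.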
“…(ii) A central limit theorem characterizing the fluctuations $S(P_n, Q_n) - \mathbb{E}\,S(P_n, Q_n)$ when P and Q are subgaussian (Section 3). Such a central limit theorem was previously only known for probability measures supported on a finite number of points (Bigot et al., 2017; Klatt et al., 2018). We use completely different techniques, inspired by recent work of Del Barrio and Loubes (2019), to prove our theorem for general subgaussian distributions.…”
Section: Summary Of Contributions (mentioning)
confidence: 99%
“…In this section, we accomplish this goal by showing a central limit theorem (CLT) for $S(P_n, Q_n)$, valid for any subgaussian measures. Bigot et al. (2017) and Klatt et al. (2018) have shown CLTs for entropic OT when the measures lie in a finite metric space (or, equivalently, when P and Q are finitely supported). Apart from being restrictive in practice, these results do not shed much light on the general situation because OT on finite metric spaces behaves quite differently from OT on $\mathbb{R}^d$.…”
Section: A Central Limit Theorem For Entropic OT (mentioning)
confidence: 99%
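In symbols, the CLT described in these two statements takes the following form (our notation; the asymptotic variance $\sigma^2(P, Q)$ and the subgaussian hypotheses are as in the cited works):

```latex
\sqrt{n}\,\bigl( S(P_n, Q_n) - \mathbb{E}\, S(P_n, Q_n) \bigr)
  \;\xrightarrow{d}\; \mathcal{N}\bigl(0,\; \sigma^2(P, Q)\bigr).
```

Note the centering at the expectation $\mathbb{E}\,S(P_n, Q_n)$ rather than at the population value $S(P, Q)$, exactly as in the quoted statement.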
“…The asymptotic behavior of empirical Wasserstein distances when both µ and ν are absolutely continuous measures has been extensively studied over the last years [15,16,17,20,37]. For probability measures supported on finite spaces, limiting distributions for the empirical Wasserstein distance have been obtained in [41], while the asymptotic distribution of the empirical Sinkhorn divergence has been recently considered in [8,29].…”
(mentioning)
confidence: 99%