Given two random variables X and Y, an operational approach is undertaken to quantify the "leakage" of information from X to Y. The resulting measure L(X→Y) is called maximal leakage, and is defined as the multiplicative increase, upon observing Y, of the probability of correctly guessing a randomized function of X, maximized over all such randomized functions. A closed-form expression for L(X→Y) is given for discrete X and Y, and it is subsequently generalized to handle a large class of random variables. The resulting properties are shown to be consistent with an axiomatic view of a leakage measure, and the definition is shown to be robust to variations in the setup. Moreover, a variant of the Shannon cipher system is studied, in which performance of an encryption scheme is measured using maximal leakage. A single-letter characterization of the optimal limit of (normalized) maximal leakage is derived and asymptotically-optimal encryption schemes are demonstrated. Furthermore, the sample complexity of estimating maximal leakage from data is characterized up to subpolynomial factors. Finally, the guessing framework used to define maximal leakage is used to give operational interpretations of commonly used leakage measures, such as Shannon capacity, maximal correlation, and local differential privacy.

arXiv:1807.07878v1 [cs.IT] 20 Jul 2018

(R4) It should accord with intuition. That is, it should not mischaracterize the (severity of) information leakage in systems that we understand well.

A. Common Information-Theoretic Approaches

Notably, many commonly used information leakage metrics do not satisfy the above requirements. For example, mutual information, which has frequently been used as a leakage measure [3]–[5], [18]–[21], arguably fails to satisfy both (R1) and (R4). Regarding the latter, consider the following example proposed by Smith [22].

Example 1: Given n ∈ N, let X = {0, 1}^{8n} and X ∼ Unif(X).
Now consider the following two conditional distributions: P_{Y|X} given by Y = X if X ≡ 0 (mod 8) and Y = 1 otherwise, and P_{Z|X} given by Z = (X_1, X_2, ..., X_{n+1}). Then the probability of guessing X correctly from Y is at least 1/8, whereas the probability of guessing X correctly from Z is only 2^{−7n+1}. However, one can readily verify that I(X; Y) ≈ (n + 0.169) log 2 ≤ I(X; Z) = (n + 1) log 2 [22]. Regarding the former, note that operational interpretations of mutual information arise in transmission and compression settings, which are different from the security setting at hand. Moreover, in those settings, mutual information arises as part of a computable characterization of the solution, rather than as part of the formulation itself; i.e., the transmission and compression problems are not defined in terms of mutual information. Mutual information could potentially be justified by appealing to rate-distortion theory [23, Section V]. In fact, a number of leakage measures in the literature are based on rate-distortion theory. For instance, Yamamoto [24] introduces a distortion function d and measures the privacy of P_{Y|X} using inf_{x̂(·)} E[d(X, x̂(Y))]. ...
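The numbers in Example 1 can be checked directly. The following is a minimal sketch (not from the paper; the helper name `check_smith_example` is ours) that computes the two guessing probabilities in closed form, together with I(X; Y) = H(Y) (in bits, since Y is a deterministic function of X) and I(X; Z) = n + 1 bits:

```python
import math

def check_smith_example(n):
    """Numerically verify Smith's example for X uniform on {0,1}^{8n}."""
    N = 2 ** (8 * n)  # |X| = 2^{8n}
    # Channel 1: Y = X if X % 8 == 0, else Y = 1.
    # Optimal guess: X = y when y % 8 == 0 (Y reveals X exactly);
    # guess X = 1 when Y = 1 (one of the 7N/8 values mapping there).
    p_guess_y = (N // 8) / N + 1 / N
    # Channel 2: Z = (X_1, ..., X_{n+1}) pins down n+1 bits,
    # leaving 8n - (n+1) = 7n - 1 uniform unknown bits.
    p_guess_z = 2 ** (-(7 * n - 1))
    # I(X;Y) = H(Y): N/8 outcomes of prob 1/N each, plus Y=1 with prob 7/8.
    i_xy = (N // 8) * (1 / N) * math.log2(N) + (7 / 8) * math.log2(8 / 7)
    i_xz = n + 1  # Z is n+1 independent uniform bits
    return p_guess_y, p_guess_z, i_xy, i_xz
```

For n = 1 this gives a guessing probability of 33/256 ≈ 0.129 ≥ 1/8 from Y versus 2^{−6} from Z, while I(X; Y) ≈ 1.169 bits < I(X; Z) = 2 bits, matching the figures quoted above.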
We consider the following problem: Alice and Bob observe sequences X^n and Y^n respectively, where {(X_i, Y_i)}_{i=1}^∞ are drawn i.i.d. from P(x, y), and they output U and V respectively, which are required to have a joint law that is close in total variation to a specified Q(u, v). One important technique for establishing impossibility results for this problem is the Hirschfeld–Gebelein–Rényi maximal correlation, which was considered by Witsenhausen [1]. Hypercontractivity, studied by Ahlswede and Gács [2], and reverse hypercontractivity, recently studied by Mossel et al. [3], provide another approach for proving impossibility results. We consider the tightest impossibility results that can be obtained using hypercontractivity and reverse hypercontractivity, and provide a necessary and sufficient condition on the source distribution P(x, y) for when this approach subsumes the maximal correlation approach. We show that the binary pair source distribution with symmetric noise satisfies this condition.
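For finite alphabets, the Hirschfeld–Gebelein–Rényi maximal correlation mentioned above has a standard computable form: it equals the second-largest singular value of the matrix Q[x, y] = P(x, y) / √(P_X(x) P_Y(y)). The following is a minimal sketch (the function name is ours, not from the abstract) illustrating this on the binary pair source with symmetric noise:

```python
import numpy as np

def maximal_correlation(P):
    """HGR maximal correlation of a finite joint pmf P[x, y]:
    the second-largest singular value of
    Q[x, y] = P[x, y] / sqrt(P_X(x) * P_Y(y))."""
    P = np.asarray(P, dtype=float)
    px = P.sum(axis=1)  # marginal of X
    py = P.sum(axis=0)  # marginal of Y
    Q = P / np.sqrt(np.outer(px, py))
    s = np.linalg.svd(Q, compute_uv=False)  # s[0] = 1 always
    return s[1]
```

For the binary pair source where X is a uniform bit and Y flips X with probability ε, the maximal correlation is |1 − 2ε|; e.g., ε = 0.1 gives 0.8.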
We describe a simple improvement over the Network Sharing outer bound [1] for the multiple unicast problem, which we call the Generalized Network Sharing (GNS) outer bound. We note two properties of this bound with regard to the two-unicast problem: (a) it is the tightest bound that can be realized using only edge-cut bounds, and (b) it is tight in the special case when all edges except those from a so-called minimal GNS set have sufficiently large capacities. Finally, we present an example showing that the GNS outer bound is not tight for the two-unicast problem.