Quantified information flow (QIF) has emerged as a rigorous approach to quantitatively measure confidentiality; the information-theoretic underpinnings of QIF allow end-users to link the computed quantities with the computational effort required on the part of an adversary to gain access to the desired confidential information. In this work, we focus on the estimation of Shannon entropy for a given program $\Pi$. As a first step, we focus on the case wherein a Boolean formula $\varphi(X,Y)$ captures the relationship between the inputs $X$ and outputs $Y$ of $\Pi$. Such formulas $\varphi(X,Y)$ have the property that for every valuation of $X$, there exists exactly one valuation of $Y$ such that $\varphi$ is satisfied. The existing techniques require $\mathcal{O}(2^m)$ model counting queries, where $m = |Y|$. We propose the first efficient algorithmic technique, called $\mathsf{EntropyEstimation}$, to estimate the Shannon entropy of $\varphi$ with PAC-style guarantees, i.e., the computed estimate is guaranteed to lie within a $(1 \pm \varepsilon)$-factor of the ground truth with confidence at least $1 - \delta$. Furthermore, $\mathsf{EntropyEstimation}$ makes only $\mathcal{O}\!\left(\frac{\min(m,n)}{\varepsilon^2}\right)$ counting and sampling queries, where $m = |Y|$ and $n = |X|$, thereby achieving a significant reduction in the number of model counting queries. We demonstrate the practical efficiency of our algorithmic framework via a detailed experimental evaluation. Our evaluation demonstrates that the proposed framework scales to formulas beyond the reach of previously known approaches.