Abstract—It is well known that the entropy $H(X)$ of a discrete random variable $X$ is always greater than or equal to the entropy $H(f(X))$ of a function $f$ of $X$, with equality if and only if $f$ is one-to-one. In this paper, we give tight bounds on $H(f(X))$ when the function $f$ is not one-to-one, and we illustrate a few scenarios where this matters. As an intermediate step towards our main result, we derive a lower bound on the entropy of a probability distribution, when only a bound on the ratio between the maximal and minimal probabilities is known. The lower bound improves on previous results in the literature, and it could find applications outside the present scenario.

I. THE PROBLEM

Let $\mathcal{X} = \{x_1, \ldots, x_n\}$ be a finite alphabet, and $X$ be any random variable (r.v.) taking values in $\mathcal{X}$ according to the probability distribution $\mathbf{p} = (p_1, p_2, \ldots, p_n)$, that is, such that $P\{X = x_i\} = p_i$, for $i = 1, 2, \ldots, n$. A well known and widely used inequality states that
$$
H(f(X)) \leq H(X), \qquad\qquad (1)
$$
where $f : \mathcal{X} \to \mathcal{Y}$ is any function defined on $\mathcal{X}$, and $H(\cdot)$ denotes the Shannon entropy. Moreover, equality holds in (1) if and only if the function $f$ is one-to-one.

The main purpose of this paper is to sharpen inequality (1) by deriving tight bounds on $H(f(X))$ when $f$ is not one-to-one. More precisely, given the r.v. $X$, an integer $2 \leq m < n$, a set $\mathcal{Y}_m = \{y_1, \ldots, y_m\}$, and the family of surjective functions
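As a brief aside, the following short Python sketch numerically illustrates inequality (1); it is not part of the paper's development, and the distribution $\mathbf{p}$ and the map $f$ below are arbitrary choices made here only for illustration.

```python
# A minimal sketch (illustrative only): numerically checking inequality (1),
# H(f(X)) <= H(X), for a toy distribution p and a non-injective surjection f.
from math import log2

def entropy(p):
    """Shannon entropy, in bits, of a probability vector p."""
    return -sum(q * log2(q) for q in p if q > 0)

# Toy r.v. X over {x1, x2, x3, x4}; the distribution is an arbitrary choice.
p = [0.4, 0.3, 0.2, 0.1]

# A surjective, non-injective map onto Y_2 = {y1, y2}:
# f(x1) = f(x2) = y1 and f(x3) = f(x4) = y2, encoded by output indices.
f = [0, 0, 1, 1]

# Distribution of f(X): each q_j sums the probabilities of the preimage of y_j.
q = [0.0, 0.0]
for p_i, j in zip(p, f):
    q[j] += p_i

print(f"H(X)    = {entropy(p):.3f} bits")   # ~1.846
print(f"H(f(X)) = {entropy(q):.3f} bits")   # ~0.881, strictly smaller since f is not one-to-one
```

When $f$ is instead a bijection, the distribution of $f(X)$ is a permutation of $\mathbf{p}$ and the two entropies coincide, in accordance with the equality condition stated above.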