We study the amount of information that is contained in "random pictures", by which we mean the sample sets of a Boolean model. To quantify the notion of "amount of information", two closely connected questions are investigated: on the one hand, we study the probability that a large number of balls is needed for a full reconstruction of a Boolean model sample set. On the other hand, we study the quantization error of the Boolean model w.r.t. the Hausdorff distance as a distortion measure.

Keywords: Boolean model; functional quantization; high resolution quantization; information based complexity; metric entropy.

2010 Mathematics Subject Classification: 94A29, 60D05; secondary: 52A22.

For the solution of the quantization problem, the following random variable is crucial; moreover, we believe that it may be of independent interest. Define the effective number of balls visible in the picture as
\[
K := \min\bigl\{ k \ge 0 : \text{there are } i_1, \dots, i_k \in \{1, \dots, N\} \text{ with } B_{i_1} \cup \dots \cup B_{i_k} = S \bigr\},
\]
where B_1, ..., B_N denote the balls constituting the sample set S. In other words, with K balls one can reproduce the black picture S exactly as with the original N balls.

We are interested in the upper tail of K, i.e. P[K ≥ n] as n → ∞. This means we study the probability that one needs many balls in order to reconstruct the picture S. In particular, we would like to understand when one can "save balls" w.r.t. the original Poisson number of balls N. To make this more precise, note that clearly K ≤ N, and so
\[
\mathbb{P}[K \ge n] \le \mathbb{P}[N \ge n] = \exp\bigl( - n \log n \cdot (1 + o(1)) \bigr), \qquad n \to \infty.
\]
We would like to show that the upper tail of K is thinner, i.e. for some a > 1,
\[
\mathbb{P}[K \ge n] = \exp\bigl( - a \cdot n \log n \cdot (1 + o(1)) \bigr), \qquad n \to \infty.
\]
It turns out that this question is non-trivial and interesting. The answer depends on the dimension d, on the type of norm used, and on the distribution L(R_1) of the radii.

Boolean models are fundamental objects in stochastic geometry and have a wide range of applications [4, 17]. However, to the knowledge of the authors, until recently mostly averages of observables of Boolean models have been studied; these often play a role when estimating the parameters of the model in applications. By contrast, the present paper deals with rare events, i.e. with large deviation probabilities.

As mentioned above, the upper tail of the random variable K is an essential ingredient for solving the so-called quantization problem, which we recall now. Let an arbitrary norm ||·|| be fixed on R^d, and let d_H denote the corresponding Hausdorff distance between the closed subsets of R^d. We define the respective quantization error for pictures by
\[
D^{(q)}(r) := \inf_{C : |C| \le e^{r}} \mathbb{E}\Bigl[ \min_{A \in C} d_H(A, S) \Bigr], \qquad r > 0.
\]
Here, the sets C are called codebooks and the superscript (q) stands for "quantization". The idea is that the "analog" signal S should be encoded by an element A ∈ C; this incurs an error, d_H(A, S), measured in Hausdorff distance. Loosely speaking, D^{(q)}(r) is then the minimal average error over all codebooks C of a size not exceeding e^r. We are interested in letting r → ∞, that is, the size of the codebooks grows, and we would like to understand the rate of decay of the corresponding quantization error.
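To make the comparison with the Poisson tail above explicit, here is a short computation; it is only a sketch, assuming that N is Poisson distributed with some fixed mean λ > 0 (the symbol λ is introduced solely for this computation and does not appear elsewhere in the text). For n ≥ 2λ,
\[
e^{-\lambda} \frac{\lambda^n}{n!} \;\le\; \mathbb{P}[N \ge n] \;=\; \sum_{k \ge n} e^{-\lambda} \frac{\lambda^k}{k!}
\;\le\; e^{-\lambda} \frac{\lambda^n}{n!} \sum_{j \ge 0} \Bigl( \frac{\lambda}{n} \Bigr)^{j}
\;\le\; 2\, e^{-\lambda} \frac{\lambda^n}{n!},
\]
using k! ≥ n! · n^{k-n} for k ≥ n. By Stirling's formula, \(\log n! = n \log n - n + O(\log n)\), and therefore
\[
-\log \mathbb{P}[N \ge n] = \log n! - n \log \lambda + O(1) = n \log n \cdot (1 + o(1)), \qquad n \to \infty,
\]
which, combined with K ≤ N, gives the stated bound \(\mathbb{P}[K \ge n] \le \exp(- n \log n \cdot (1 + o(1)))\).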
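As a purely illustrative complement (not part of the paper's method), the following Python sketch simulates a planar Boolean model on the unit square, discretizes it on a pixel grid, and bounds the effective number of balls K from above by greedy set cover. The intensity, the radius distribution, and the grid resolution are arbitrary choices made only for this example.

\begin{verbatim}
# Illustrative sketch (not from the paper): simulate a planar Boolean model on
# [0, 1]^2, rasterize it, and bound the effective number of balls K from above
# by greedy set cover.  All parameters below are assumptions for the example.
import numpy as np

rng = np.random.default_rng(0)

intensity = 30.0   # mean of the Poisson number of balls N (assumed)
grid = 400         # pixels per side of the discretization (assumed)

N = rng.poisson(intensity)                 # Poisson number of balls
centres = rng.random((N, 2))               # uniform centres in the unit square
radii = rng.uniform(0.02, 0.1, size=N)     # i.i.d. radii, here uniform (assumed)

# Rasterize each ball (Euclidean norm): boolean mask of the pixels it covers.
xs = (np.arange(grid) + 0.5) / grid
X, Y = np.meshgrid(xs, xs, indexing="ij")
masks = [(X - cx) ** 2 + (Y - cy) ** 2 <= r ** 2
         for (cx, cy), r in zip(centres, radii)]
picture = np.logical_or.reduce(masks) if masks else np.zeros((grid, grid), bool)

# Greedy set cover: repeatedly take the ball covering the most still-uncovered
# pixels until the whole picture S is reproduced.  The number of balls used is
# an upper bound for K on the discretized picture.
uncovered = picture.copy()
remaining = list(range(N))
K_upper = 0
while uncovered.any():
    gains = [np.count_nonzero(masks[i] & uncovered) for i in remaining]
    best = int(np.argmax(gains))
    uncovered &= ~masks[remaining.pop(best)]
    K_upper += 1

print(f"N = {N} balls sampled; greedy reconstruction uses K <= {K_upper} of them")
\end{verbatim}

Note that on the discretized picture the greedy procedure only gives an upper bound on K; the exact minimum is a set cover problem, which could be solved exactly for the small values of N arising here.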