Random sampling is a simple but powerful method in statistics and the design of randomized algorithms. In a typical application, random sampling can be applied to estimate an extreme value, say the maximum, of a function $f$ over a set $S \subseteq \mathbb{R}^n$. To do so, one may select a simpler (even finite) subset $S_0 \subseteq S$, draw a number of random samples over $S_0$, and pick the best sample. The hope is to find a good approximate solution with reasonable probability. This paper sets out to present a number of scenarios for $f$, $S$, and $S_0$ where certain probability bounds can be established, leading to a quality assurance of the procedure. In our setting, $f$ is a multivariate polynomial function. We prove that if $f$ is a $d$-th order homogeneous polynomial in $n$ variables, $F$ is its corresponding super-symmetric tensor, and $\xi_i$ ($1 \le i \le n$) are i.i.d. Bernoulli random variables taking values $1$ or $-1$ with equal probability, then
$$\mathrm{Prob}\left\{ f(\xi_1, \xi_2, \cdots, \xi_n) \ge c_1\, n^{-\frac{d}{2}} \|F\|_1 \right\} \ge c_2,$$
where $c_1, c_2 > 0$ are two universal constants. Several new inequalities concerning probabilities of the above nature are presented in this paper. Moreover, we show that the bounds are tight in most cases. Applications of our results in optimization are discussed as well.
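The sampling procedure described above can be sketched as follows. This is a minimal illustration, not the paper's method: the polynomial, its coefficient dictionary, and the functions `random_homogeneous_cubic`, `evaluate`, and `sample_max` are all hypothetical names introduced here, and the coefficient sum stands in for $\|F\|_1$ only up to the multiplicities of a genuine super-symmetric tensor. It draws i.i.d. Bernoulli $\pm 1$ samples over $S_0 = \{-1, 1\}^n$ for a degree-3 homogeneous polynomial and keeps the best value found.

```python
import itertools
import random

def random_homogeneous_cubic(n, rng):
    """Random coefficients a_{ijk} (i <= j <= k) for a hypothetical
    degree-3 homogeneous polynomial f(x) = sum a_{ijk} x_i x_j x_k."""
    return {idx: rng.uniform(-1, 1)
            for idx in itertools.combinations_with_replacement(range(n), 3)}

def evaluate(coeffs, x):
    """Evaluate f at the point x."""
    return sum(a * x[i] * x[j] * x[k] for (i, j, k), a in coeffs.items())

def sample_max(coeffs, n, num_samples, rng):
    """Draw Bernoulli(+/-1) samples over S_0 = {-1,1}^n, pick the best one."""
    best = float("-inf")
    for _ in range(num_samples):
        x = [rng.choice((-1, 1)) for _ in range(n)]
        best = max(best, evaluate(coeffs, x))
    return best

rng = random.Random(0)
n, d = 8, 3
coeffs = random_homogeneous_cubic(n, rng)
# Proxy for ||F||_1: the sum of absolute coefficient values (illustrative only).
norm_F1 = sum(abs(a) for a in coeffs.values())
# The abstract's bound says each sample reaches order n^{-d/2} * ||F||_1
# with constant probability c_2, so a modest number of samples suffices.
threshold = n ** (-d / 2) * norm_F1  # up to the universal constant c_1
print(sample_max(coeffs, n, 100, rng), threshold)
```

Because each sample independently exceeds the threshold with probability at least $c_2$, taking $m$ samples fails only with probability at most $(1 - c_2)^m$, which decays geometrically.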