For the Gaussian sequence model, we obtain non-asymptotic minimax rates of estimation of the linear, quadratic and ℓ2-norm functionals on classes of sparse vectors and construct optimal estimators that attain these rates. The main object of interest is the class B0(s) of s-sparse vectors θ = (θ1, . . . , θd), for which we also provide completely adaptive estimators (independent of s and of the noise variance σ) having only logarithmically slower rates than the minimax ones. Furthermore, we obtain the minimax rates on the ℓq-balls Bq(r) with 0 < q ≤ 2. This analysis shows that there are, in general, three zones in the rates of convergence, which we call the sparse zone, the dense zone and the degenerate zone, while a fourth zone appears for estimation of the quadratic functional. We show that, as opposed to estimation of θ itself, the correct logarithmic terms in the optimal rates for the sparse zone scale as log(d/s²) and not as log(d/s). For the class B0(s), the rates of estimation of the linear functional and of the ℓ2-norm have a simple elbow at s = √d (the boundary between the sparse and the dense zones) and exhibit similar performance, whereas estimation of the quadratic functional Q(θ) reveals more complex effects and is not possible on the basis of the sparsity condition θ ∈ B0(s) alone. Finally, we apply our results on estimation of the ℓ2-norm to the problem of testing against sparse alternatives. In particular, we obtain a non-asymptotic analog of the Ingster-Donoho-Jin theory revealing some effects that were not captured by the previous asymptotic analysis.
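As a sketch of the standard setting these statements refer to (the notation below follows common usage for the Gaussian sequence model and is assumed here rather than quoted from the paper), one observes
\[
y_j = \theta_j + \sigma \xi_j, \qquad \xi_j \ \text{i.i.d.} \ \mathcal{N}(0,1), \qquad j = 1, \dots, d,
\]
and the functionals under study are
\[
L(\theta) = \sum_{j=1}^d \theta_j, \qquad Q(\theta) = \sum_{j=1}^d \theta_j^2, \qquad \|\theta\|_2 = \sqrt{Q(\theta)},
\]
while the sparsity class is
\[
B_0(s) = \Bigl\{ \theta \in \mathbb{R}^d : \sum_{j=1}^d \mathbf{1}(\theta_j \neq 0) \le s \Bigr\}.
\]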