In this paper, we address the problem of estimating a multidimensional density f from indirect observations drawn from the statistical model Y = X + ε. Here, ε is a measurement error, independent of the random vector X of interest, whose density with respect to the Lebesgue measure is known. Our aim is to attain the optimal accuracy of estimation under Lp-losses when the characteristic function of the error ε decays polynomially. To this end, we first construct a fully data-driven kernel estimator of f. We then derive an oracle inequality for this estimator under very mild assumptions on the characteristic function of the error ε. As a consequence, we obtain minimax adaptive upper bounds over a large scale of anisotropic Nikolskii classes, and we prove that our estimator is asymptotically rate optimal when p ∈ [2, +∞]. Furthermore, our estimation procedure adapts automatically to a possible independence structure of f, which allows us to significantly improve the accuracy of estimation.
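For concreteness, we recall below a minimal sketch of a classical deconvolution kernel estimator in this model; the notation used here (the Fourier transform ǧ, the kernel K, the bandwidth vector h, the error density p_ε) is introduced for illustration only, and the fully data-driven procedure of the paper may differ in its exact construction, in particular in how the bandwidth is selected from the data.

\[
  \widehat{f}_{h}(x)
  \;=\;
  \frac{1}{(2\pi)^{d}\, n}\sum_{j=1}^{n}
  \int_{\mathbb{R}^{d}}
  e^{-i\langle t,\, x - Y_{j}\rangle}\,
  \frac{\check{K}\bigl(t_{1}h_{1},\dots,t_{d}h_{d}\bigr)}{\check{p}_{\varepsilon}(t)}\,
  \mathrm{d}t,
  \qquad
  \check{g}(t) := \int_{\mathbb{R}^{d}} g(x)\, e^{i\langle t, x\rangle}\,\mathrm{d}x,
\]

where Y_1, …, Y_n are observations from Y = X + ε, K is a kernel with integrable Fourier transform ǨK, h = (h_1, …, h_d) is a bandwidth vector, and p̌_ε is the (nonvanishing) characteristic function of the error; a data-driven rule then selects h from the observations.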