We construct norming meshes for polynomial optimization by the classical Markov inequality on general convex bodies in $\mathbb{R}^d$, and by a tangential Markov inequality via an estimate of the Dubiner distance on smooth convex bodies. These allow us to compute a $(1-\varepsilon)$-approximation to the minimum of any polynomial of degree not exceeding $n$ by $\mathcal{O}\bigl((n/\sqrt{\varepsilon})^{\alpha d}\bigr)$ samples, with $\alpha = 2$ in the general case and $\alpha = 1$ in the smooth case. These constructions rest on three cornerstones of convex geometry: the Bieberbach volume inequality and the Leichtweiss inequality on the affine breadth eccentricity, and the Rolling Ball Theorem, respectively.
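To illustrate the sampling scheme in the simplest setting, the following is a minimal sketch, not the paper's construction: it takes the convex body to be the unit square $[0,1]^2$, uses a uniform grid as a stand-in for a norming mesh, and applies the general-case exponent $\alpha = 2$, i.e. roughly $(n/\sqrt{\varepsilon})^2$ points per coordinate. The function `grid_min` and the constant `c` are illustrative assumptions, not quantities defined in the text.

```python
import itertools
import math

def grid_min(p, n, eps, d=2, c=1.0):
    """Approximate the minimum of a degree-n polynomial p over [0,1]^d
    by brute-force sampling on a uniform grid.

    The grid has about (c * n / sqrt(eps))**2 points per coordinate,
    mimicking the alpha = 2 (general convex body) sample count
    O((n / sqrt(eps))**(alpha * d)); the constant c is illustrative.
    """
    m = max(2, math.ceil((c * n / math.sqrt(eps)) ** 2))
    pts = [i / (m - 1) for i in range(m)]
    # Take the minimum over all m**d grid points.
    return min(p(x) for x in itertools.product(pts, repeat=d))

# Example: p(x, y) = (x - 0.3)^2 + (y - 0.7)^2, a degree-2 polynomial
# whose true minimum over [0,1]^2 is 0.
p = lambda x: (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2
approx = grid_min(p, n=2, eps=0.1)
```

The sketch exhibits the trade-off stated in the abstract: halving $\varepsilon$ multiplies the total number of samples by roughly $2^{\alpha d / 2}$, so the smooth-case exponent $\alpha = 1$ yields substantially smaller meshes than the general case.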