Radiometric compensation methods remove the effect of the spatially varying surface reflectance of the underlying texture when projecting on textured surfaces. All prior work samples the surface-reflectance-dependent radiometric transfer function from the projector to the camera at every pixel, which requires the camera to observe tens or hundreds of images projected by the projector. In this paper, we cast the radiometric compensation problem as the sampling and reconstruction of a multi-dimensional radiometric transfer function that models the color transfer function from the projector to an observing camera and the surface reflectance in a unified manner. Such a multi-dimensional representation makes no assumption about the linearity of the projector-to-camera color transfer function and can therefore handle projectors with non-linear color transfer functions (e.g. DLP, LCOS, LED-based, or laser-based). We show that a well-curated sampling of this multi-dimensional function, achieved by exploiting the following key properties, is adequate for its accurate representation: (a) the spectral reflectance of most real-world materials is smooth and can be well represented by a lower-dimensional function; (b) the reflectance properties of the underlying texture have strong redundancies – for example, multiple pixels or even regions can have similar surface reflectance; (c) the color transfer function from the projector to the camera has strong input coherence. The proposed sampling allows us to reduce the number of projected images that need to be observed by a camera by up to two orders of magnitude, to a minimum of only two. We then present a new multi-dimensional scattered data interpolation technique to reconstruct the radiometric transfer function at high spatial density (i.e. at every pixel) to compute the compensation image. We show that the accuracy of our interpolation technique is higher than that of any existing method.