We propose a set of dynamic shading enhancement techniques for improving the perception of details, features, and overall shape characteristics in images created with Reflectance Transformation Imaging (RTI) techniques. Selecting these perceptual enhancement filters can significantly improve the user's ability to interactively inspect the content of 2D RTI media by zooming, panning, and changing the illumination direction. In particular, we present two groups of strategies for RTI image enhancement based on two main ideas: exploiting the unsharp masking methodology in the RTI-specific context, and locally optimizing the incident light direction for improved RTI image sharpness and illumination of surface features. The results section evaluates the proposed enhancements on a number of datasets and compares them with existing techniques.
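As a rough illustration of the first idea, the sketch below applies unsharp masking to the luminance of a relit RTI image, assuming a Polynomial Texture Map (PTM) style representation with six biquadratic coefficients per pixel. The function names, the Gaussian blur radius, and the gain are illustrative choices, not the paper's actual formulation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def ptm_luminance(coeffs, lu, lv):
    """Evaluate per-pixel PTM luminance for light direction (lu, lv).
    coeffs: (H, W, 6) biquadratic coefficients a0..a5 (assumed layout)."""
    a0, a1, a2, a3, a4, a5 = np.moveaxis(coeffs, -1, 0)
    return a0 * lu * lu + a1 * lv * lv + a2 * lu * lv + a3 * lu + a4 * lv + a5

def unsharp_enhanced_relight(coeffs, albedo, lu, lv, sigma=2.0, gain=1.5):
    """Relight and sharpen the luminance with unsharp masking:
    L' = L + gain * (L - blur(L)); albedo is the (H, W, 3) unshaded color."""
    lum = ptm_luminance(coeffs, lu, lv)
    smooth = gaussian_filter(lum, sigma)
    enhanced = lum + gain * (lum - smooth)
    return np.clip(albedo * enhanced[..., None], 0.0, 1.0)
```

Changing `sigma` controls the scale of the surface features that get emphasized, while `gain` controls the strength of the sharpening.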
We present a statistical method for estimating the Spatially Varying Bidirectional Reflectance Distribution Function (SVBRDF) of an object with complex geometry, starting from video sequences acquired under fixed but general lighting conditions. The aim of this work is to define a method that simplifies the acquisition of the object's surface appearance and allows the reconstruction of an approximate SVBRDF. The final output can be used with a 3D model of the object to obtain accurate and photo-realistic renderings. The method consists of three steps: approximating the environment map of the acquisition scene, using the object itself as a probe; estimating the diffuse color of the object; and estimating the specular components of the object's main materials using a Phong model. All the steps are based on statistical analysis of the color samples that the video sequences project onto the surface of the object. Although the method has some limitations, the trade-off between ease of acquisition and the quality of the results makes it useful for practical applications.
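As a sketch of the last two steps, the snippet below uses a per-pixel median over the projected color samples as a robust diffuse estimate and fits a standard Phong specular lobe to the residuals by grid search. The median-based estimator, the grid search, and all function names are assumptions made for illustration, not the paper's actual statistical analysis.

```python
import numpy as np

def estimate_diffuse(samples):
    """Robust diffuse estimate for one surface point from its (N, 3) color
    samples across frames: the median discards sparse specular highlights."""
    return np.median(samples, axis=0)

def phong_specular(n, l, v, ks, shininess):
    """Phong specular term ks * max(R.V, 0)^shininess, with R the mirror
    reflection of the light direction l about the unit normal n."""
    r = 2.0 * np.dot(n, l) * n - l
    return ks * max(np.dot(r, v), 0.0) ** shininess

def fit_specular(samples, diffuse, n, lights, views, ks_grid, shin_grid):
    """Pick (ks, shininess) minimizing squared error between the observed
    residual intensity (sample - diffuse) and the predicted Phong term."""
    residual = np.linalg.norm(samples - diffuse, axis=1)
    best, best_err = (ks_grid[0], shin_grid[0]), np.inf
    for ks in ks_grid:
        for shin in shin_grid:
            pred = np.array([phong_specular(n, l, v, ks, shin)
                             for l, v in zip(lights, views)])
            err = np.sum((residual - pred) ** 2)
            if err < best_err:
                best, best_err = (ks, shin), err
    return best
```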
Detecting geometric changes between two 3D captures of the same location performed at different moments is a critical operation for all systems requiring a precise segmentation between change and no-change regions. Such application scenarios include 3D surface reconstruction, environment monitoring, natural event management and forensic science. Unfortunately, typical 3D scanning setups cannot provide any one-to-one mapping between measured samples in static regions: in particular, both extrinsic and intrinsic sensor parameters may vary over time, while sensor noise and outliers additionally corrupt the data. In this paper, we adopt a multi-scale approach to robustly tackle these issues. Starting from two point clouds, we first remove outliers using a probabilistic operator. Then, we detect the actual change using the implicit surface defined by the point clouds under a Growing Least Squares reconstruction that, compared to the classical proximity measure, offers a more robust change/no-change characterization near the temporal intersection of the scans and in areas exhibiting different sampling density and direction. The resulting classification is enhanced with a spatial reasoning step to solve critical geometric configurations that are common in man-made environments. We validate our approach on a synthetic test case and on a collection of real datasets acquired using commodity hardware. Finally, we show how 3D reconstruction benefits from the resulting precise change/no-change segmentation.
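For orientation, the sketch below shows the simpler baseline the paper improves upon: statistical outlier removal followed by change/no-change labelling by nearest-neighbour proximity between the two clouds. The thresholds and function names are assumptions, and the paper replaces the plain point-to-point distance with the distance to the implicit surface obtained from a Growing Least Squares reconstruction.

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_outliers(points, k=8, std_ratio=2.0):
    """Discard points whose mean distance to their k nearest neighbours
    exceeds mean + std_ratio * std over the whole cloud (a simple stand-in
    for the paper's probabilistic operator)."""
    tree = cKDTree(points)
    d, _ = tree.query(points, k=k + 1)   # first neighbour is the point itself
    mean_d = d[:, 1:].mean(axis=1)
    keep = mean_d < mean_d.mean() + std_ratio * mean_d.std()
    return points[keep]

def classify_change(points_a, points_b, threshold=0.05):
    """Label each point of cloud B as changed if its nearest neighbour in
    cloud A is farther than `threshold` (classical proximity measure)."""
    tree = cKDTree(points_a)
    d, _ = tree.query(points_b)
    return d > threshold                 # True = change, False = no-change
```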