One of the key ingredients of any physically based rendering system is a detailed specification characterizing the interaction of light and matter for all materials present in a scene, typically via the Bidirectional Reflectance Distribution Function (BRDF). Despite their utility, access to real-world BRDF datasets remains limited: measurements involve scanning a four-dimensional domain at sufficient resolution, a tedious and often infeasibly time-consuming process. We propose a new parameterization that automatically adapts to the behavior of a material, warping the underlying 4D domain so that most of the volume maps to regions where the BRDF takes on non-negligible values, while irrelevant regions are strongly compressed. This adaptation only requires a brief 1D or 2D measurement of the material's retro-reflective properties. Our parameterization is unified in the sense that it combines several steps that previously required intermediate data conversions: the same mapping can simultaneously be used for BRDF acquisition, storage, and efficient Monte Carlo sample generation. We observe that these desiderata are satisfied by a core operation present in modern rendering systems, which maps uniform variates to direction samples that are distributed proportionally to an analytic BRDF. Based on this insight, we define our adaptive parameterization as an invertible, retro-reflectively driven mapping between the parametric and directional domains. We are able to create noise-free renderings of existing BRDF datasets after conversion into our representation, with the added benefit that the warped data is significantly more compact, requiring 16 KiB and 544 KiB per spectral channel for isotropic and anisotropic specimens, respectively. Finally, we show how to modify an existing gonio-photometer to provide the needed retro-reflection measurements. Acquisition then proceeds within a 4D space that is warped by our parameterization.
We demonstrate the efficacy of this scheme by acquiring the first set of spectral BRDFs of surfaces exhibiting arbitrary roughness, including anisotropy.
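The core operation the abstract refers to is an invertible map between uniform variates on the unit square and directions distributed proportionally to an analytic microfacet distribution. The sketch below (hypothetical function names, not the paper's code) illustrates this with the classical isotropic GGX normal distribution; the paper's actual parameterization is instead driven by measured retro-reflection data, which this sketch omits.

```python
import math

def warp_uniform_to_ggx(u1, u2, alpha):
    """Map (u1, u2) in [0,1)^2 to a half-vector direction distributed
    according to the isotropic GGX NDF with roughness alpha.
    Stand-in for the paper's adaptive, measurement-driven warp."""
    tan2_theta = alpha * alpha * u1 / (1.0 - u1)
    cos_theta = 1.0 / math.sqrt(1.0 + tan2_theta)
    sin_theta = math.sqrt(max(0.0, 1.0 - cos_theta * cos_theta))
    phi = 2.0 * math.pi * u2
    return (sin_theta * math.cos(phi),
            sin_theta * math.sin(phi),
            cos_theta)

def warp_ggx_to_uniform(w, alpha):
    """Inverse mapping: recover the unit-square coordinates of a
    direction, so the same warp serves storage and sampling."""
    x, y, z = w
    tan2_theta = (x * x + y * y) / (z * z)
    u1 = tan2_theta / (alpha * alpha + tan2_theta)
    phi = math.atan2(y, x) % (2.0 * math.pi)
    return (u1, phi / (2.0 * math.pi))
```

Because the forward and inverse warps agree exactly, tabulating a measured BRDF over the unit square concentrates samples where the analytic proxy predicts significant energy, which is the property the paper exploits for acquisition, storage, and importance sampling alike.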
Figure 1: Top-left: rendering a voxelized forest at decreasing levels of detail (left to right). Bottom-right: visualization of the voxel structure at the matching resolutions. We use the SGGX microflake distribution to represent volumetric anisotropic materials. Our representation supports downscaling and interpolation, resulting in smooth and antialiased renderings at multiple scales.

Abstract: We introduce the Symmetric GGX (SGGX) distribution to represent spatially-varying properties of anisotropic microflake participating media. Our key theoretical insight is to represent a microflake distribution by the projected area of the microflakes. We use the projected area to parameterize the shape of an ellipsoid, from which we recover a distribution of normals. The representation based on the projected area allows for robust linear interpolation and prefiltering, and thanks to its geometric interpretation, we derive closed-form expressions for all operations used in the microflake framework. We also incorporate microflakes with diffuse reflectance in our theoretical framework. This allows us to model the appearance of rough diffuse materials in addition to rough specular materials. Finally, we use the idea of sampling the distribution of visible normals to design a perfect importance sampling technique for our SGGX microflake phase functions. It is analytic, deterministic, simple to implement, and one order of magnitude faster than previous work.
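The projected-area representation can be made concrete in a few lines: for a symmetric positive-definite 3×3 matrix S describing the ellipsoid, the SGGX paper gives the projected area as sigma(omega) = sqrt(omega^T S omega) and the normal distribution as D(omega) = 1 / (pi sqrt(|S|) (omega^T S^{-1} omega)^2). The sketch below (illustrative function names) evaluates both; note that because sigma is linear in S, averaging the S matrices of neighboring voxels is exactly the interpolation/prefiltering operation the abstract describes.

```python
import numpy as np

def sggx_projected_area(omega, S):
    # sigma(omega) = sqrt(omega^T S omega): projected area of the
    # ellipsoid with matrix S along direction omega.
    return np.sqrt(omega @ S @ omega)

def sggx_ndf(omega, S):
    # D(omega) = 1 / (pi * sqrt(det S) * (omega^T S^{-1} omega)^2)
    return 1.0 / (np.pi * np.sqrt(np.linalg.det(S))
                  * (omega @ np.linalg.inv(S) @ omega) ** 2)

def sggx_interpolate(S_list, weights):
    # Linear interpolation of SGGX matrices, the operation that makes
    # downscaling and mipmapping of the voxel data robust.
    return sum(w * S for w, S in zip(weights, S_list))
```

For S equal to the identity the ellipsoid is the unit sphere, so sigma(omega) = 1 and D(omega) = 1/pi for every unit direction, which is a convenient sanity check.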
Figure 1: A high-quality animated production model (Ptex T-rex model © Walt Disney Animation Studios) rendered in real time under directional and environment lighting using LEADR mapping on an NVidia GTX 480 GPU. The surface appearance is preserved at all scales, using a single shading sample per pixel. Combined with adaptive GPU tessellation, our method provides the fastest, seamless, and antialiased progressive representation for displaced surfaces.

Abstract: We present Linear Efficient Antialiased Displacement and Reflectance (LEADR) mapping, a reflectance filtering technique for displacement mapped surfaces. Similarly to LEAN mapping, it employs two mipmapped texture maps, which store the first two moments of the displacement gradients. During rendering, the projection of this data over a pixel is used to compute a non-centered anisotropic Beckmann distribution using only simple, linear filtering operations. The distribution is then injected in a new, physically based, rough surface microfacet BRDF model that includes masking and shadowing effects for both diffuse and specular reflection under directional, point, and environment lighting. Furthermore, our method is compatible with animation and deformation, making it extremely general and flexible. Combined with an adaptive meshing scheme, LEADR mapping provides the very first seamless and hardware-accelerated multi-resolution representation for surfaces. In order to demonstrate its effectiveness, we render highly detailed production models in real time on a commodity GPU, with quality matching supersampled ground-truth images.
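The "first two moments of the displacement gradients" can be sketched offline in a few lines: store E[g] and E[g g^T] of the slopes in mipmapped maps, then recover the mean slope and slope covariance of a non-centered Beckmann distribution at shading time via cov = E[g g^T] - E[g]E[g]^T. The sketch below (hypothetical names; finite-difference gradients stand in for the actual texture pipeline) shows the filtering step for a single footprint.

```python
import numpy as np

def leadr_moments(heightmap):
    """Compute the two moment maps over one pixel footprint:
    m1 = (E[gx], E[gy]) and m2 = (E[gx^2], E[gy^2], E[gx*gy])."""
    gx, gy = np.gradient(heightmap)  # displacement gradients (slopes)
    m1 = np.array([gx.mean(), gy.mean()])
    m2 = np.array([(gx ** 2).mean(), (gy ** 2).mean(), (gx * gy).mean()])
    return m1, m2

def beckmann_params(m1, m2):
    """Recover the non-centered anisotropic Beckmann parameters:
    the mean slope (filtered normal) and the slope covariance."""
    cov = np.array([[m2[0] - m1[0] ** 2, m2[2] - m1[0] * m1[1]],
                    [m2[2] - m1[0] * m1[1], m2[1] - m1[1] ** 2]])
    return m1, cov
```

Because the moments filter linearly, standard mipmapping and anisotropic texture filtering produce the correct footprint averages for free, which is what makes the technique run at one shading sample per pixel.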