We present a novel hybrid rendering method for diffuse and glossy indirect illumination. The scene is rendered using standard rasterization on the GPU. In a shader, secondary ray queries are used to sample incident light and to compute indirect lighting. We observe that casting many rays matters more than obtaining precise results for each individual ray. Thus, we approximate secondary rays by intersecting them with precomputed layered depth images of the scene. We achieve interactive to real-time frame rates for scenes with indirect diffuse and glossy effects.
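A minimal sketch of this approximation, assuming a single-view layered depth image stored as a flat buffer; the `Ldi` struct, `intersectLdi`, the fixed step count, and the slab `thickness` are illustrative assumptions, not the paper's actual interface:

```cpp
#include <vector>

// Hypothetical layered depth image: for each pixel, a few depth samples
// (front-to-back) captured from a fixed view of the scene.
struct Ldi {
    int width = 0, height = 0, layers = 0;
    std::vector<float> depth; // width * height * layers entries

    float at(int x, int y, int l) const {
        return depth[(y * width + x) * layers + l];
    }
};

// Approximate a secondary ray by marching through the LDI in image space
// and testing the ray depth against the stored layer depths. 'thickness'
// accounts for an LDI sample representing a thin slab, not a point.
bool intersectLdi(const Ldi& ldi,
                  float ox, float oy, float oz,  // ray origin in LDI space
                  float dx, float dy, float dz,  // ray direction
                  int steps, float thickness,
                  float& hitT)
{
    for (int i = 1; i <= steps; ++i) {
        float t = float(i) / float(steps);
        int px = int((ox + t * dx) * ldi.width);
        int py = int((oy + t * dy) * ldi.height);
        if (px < 0 || px >= ldi.width || py < 0 || py >= ldi.height)
            return false;                        // ray left the LDI footprint
        float rayZ = oz + t * dz;
        for (int l = 0; l < ldi.layers; ++l) {
            float surfZ = ldi.at(px, py, l);
            if (rayZ >= surfZ && rayZ <= surfZ + thickness) {
                hitT = t;                        // approximate hit found
                return true;
            }
        }
    }
    return false;
}
```

Because the method favors many coarse rays over a few precise ones, a small fixed step count and a generous thickness tolerance are acceptable in such a scheme.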
Figure 1: Reproducing fine surface detail using our method for signal-optimal displacement mapping. From left to right: the Augustus model and close-ups of the full-resolution model (366 MB GPU memory, 19.3 ms rendering time), and our reconstructions with fitting errors εmax = 0.1 (90 MB, 11.9 ms) and εmax = 0.5 (28 MB, 3.7 ms), respectively.

Abstract: We present a novel representation for storing sub-triangle signals, such as colors, normals, or displacements, directly with the triangle mesh. Signal samples are stored as guided by hardware-tessellation patterns. Thus, we can render directly from our representation by assigning signal samples to the attributes of vertices generated by the hardware tessellator. Contrary to texture mapping, our approach does not require any atlas generation, chartification, or uv-unwrapping, so it does not suffer from texture-related artifacts such as discontinuities across chart boundaries or distortion. Moreover, our approach allows the optimal sampling rate to be specified adaptively on a per-triangle basis, resulting in significant memory savings for most signal types. We propose a signal-optimal approach for converting arbitrary signals, including existing assets with textures or mesh colors, into our representation. Further, we provide efficient algorithms for mipmapping and bi- and tri-linear interpolation directly in our representation. Our approach is ideally suited for displacement mapping: it automatically generates crack-free, view-dependent displacement-mapped models, enabling continuous level of detail.
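As an illustration of storing samples in tessellator emission order, here is a minimal sketch assuming uniform integer tessellation and a row-major traversal of the barycentric grid; the `TriangleSignal` struct and its layout are hypothetical, not the paper's actual data structure:

```cpp
#include <vector>

// A hypothetical per-triangle sample block: one signal value for each
// vertex that a uniform integer tessellation of level 'level' generates.
// Vertices lie at barycentrics (i/n, j/n, 1 - (i+j)/n) with i + j <= n,
// stored row by row, mirroring the tessellator's emission order
// assumed here for illustration.
struct TriangleSignal {
    int level;                  // per-triangle sampling rate (tess level)
    std::vector<float> samples; // (level+1)*(level+2)/2 values

    // Flat index of the sample at barycentric grid position (i, j).
    int index(int i, int j) const {
        // Row r holds (level - r + 1) entries; sum rows 0..i-1 first.
        return i * (level + 1) - i * (i - 1) / 2 + j;
    }

    float sample(int i, int j) const { return samples[index(i, j)]; }
};
```

The per-triangle `level` field is what enables the adaptive sampling rate described in the abstract: flat regions can use a low level while detailed regions use a high one, with no shared atlas to constrain either.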
Graphics hardware has progressively been optimized to render more triangles with increasingly flexible shading. For highly detailed geometry, interactive applications restricted themselves to performing transforms on fixed geometry, since they could not afford the cost of generating and transferring smooth or displaced geometry to the GPU at render time. As a result of recent advances in graphics hardware, in particular the GPU tessellation unit, complex geometry can now be generated on the fly within the GPU's rendering pipeline. This has enabled the generation and displacement of smooth parametric surfaces in real-time applications. However, many well-established approaches from offline rendering are not directly transferable, due to the limited tessellation patterns and the parallel execution model of the tessellation stage. In this survey, we provide an overview of recent work and challenges in this area by summarizing, discussing, and comparing methods for the real-time rendering of smooth and highly detailed surfaces.
Figure 1: Local displacements such as character footprints (left) or sculpting brushes (center), including vector displacement (right), are applied to arbitrary geometry using hardware tessellation patterns. Memory for displacement is dynamically allocated on the GPU. The total time to apply a deformation is less than a millisecond, even for complex models and scenes.

Abstract: We propose a novel method for local displacement events in large scenes, such as scratches, footsteps, or sculpting operations. Deformations are stored as displacements for vertices generated by hardware tessellation. Adaptive mesh refinement, application of the displacement, and all involved memory management happen entirely on the GPU. We show various extensions to our approach, such as on-the-fly normal computation and multi-resolution editing. In typical game scenes we perform local deformations at arbitrary positions in far less than one millisecond. This makes the method particularly suited for games and interactive sculpting applications.
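A rough CPU-side sketch of the bookkeeping such a system needs, assuming displacement blocks are allocated lazily per edited triangle; the names `DisplacementStore`, `fetch`, and `applyBrush`, as well as the Gaussian falloff, are illustrative assumptions, and the paper performs this entirely on the GPU:

```cpp
#include <array>
#include <cmath>
#include <unordered_map>
#include <vector>

// Hypothetical displacement store: blocks of per-vertex offsets are
// allocated lazily, one per tessellated triangle touched by an edit.
struct DisplacementStore {
    int samplesPerTriangle;
    std::unordered_map<int, std::vector<float>> blocks; // triId -> offsets

    std::vector<float>& fetch(int triId) {
        auto& b = blocks[triId];                 // allocate on first touch
        if (b.empty()) b.assign(samplesPerTriangle, 0.0f);
        return b;
    }
};

// Accumulate a radially falling-off brush (e.g., a footprint stamp) into
// the displacement samples of one triangle. 'positions' holds the world
// positions of the tessellation-pattern vertices for this triangle.
void applyBrush(DisplacementStore& store, int triId,
                const std::vector<std::array<float, 3>>& positions,
                const std::array<float, 3>& center,
                float radius, float depth)
{
    auto& block = store.fetch(triId);
    for (size_t v = 0; v < positions.size(); ++v) {
        float dx = positions[v][0] - center[0];
        float dy = positions[v][1] - center[1];
        float dz = positions[v][2] - center[2];
        float d2 = dx * dx + dy * dy + dz * dz;
        if (d2 < radius * radius)                // inside the brush footprint
            block[v] += depth * std::exp(-4.0f * d2 / (radius * radius));
    }
}
```

Allocating blocks only for triangles a brush actually touches keeps the memory footprint proportional to the edited area rather than the whole scene, which matches the dynamic GPU allocation the abstract describes.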