Figure 1: Example images rendered in real time by our method. We achieve near-accurate depth-of-field effects, including lens aberrations such as spherical aberration (a). The efficiency of our method makes it well suited for artistic purposes, and we support complex simulations such as tilt-shift photography (b). Further, our system offers intuitive control over depth of field, and we extend the physical model (c) to achieve an expressive yet convincing result (d); here, the background statues stay in focus.
Abstract

We present a novel rendering system for defocus blur and lens effects. It supports physically based rendering and outperforms previous approaches through a novel GPU-based tracing method. Our solution achieves higher precision than competing real-time solutions, and our results are mostly indistinguishable from offline rendering. Our method is also more general and can integrate advanced simulations, such as simple geometric lens models that enable various lens-aberration effects. The latter are crucial for realism and are often employed in artistic contexts as well. We show that available artistic lenses can be simulated by our method. In this spirit, our work introduces intuitive control over depth-of-field effects. The physical basis is a crucial starting point for new artistic renderings based on a generalized focal surface, which emphasizes particular elements in the scene while retaining a realistic look. Our real-time solution provides realistic as well as plausibly expressive results.
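To make the underlying camera model concrete: the simplest geometric lens model for depth of field is the thin lens, where each ray starts from a sampled point on the aperture and passes through the in-focus point of its pixel. The following sketch illustrates that sampling step only; all names and the Python formulation are illustrative assumptions, not the paper's actual GPU implementation.

```python
# Illustrative thin-lens ray generation for depth-of-field sampling.
# This is a minimal sketch, not the paper's method; names are hypothetical.
import math
import random

def sample_thin_lens_ray(pixel_dir, aperture_radius, focal_distance):
    """Generate a depth-of-field ray through an ideal thin lens.

    pixel_dir:       normalized camera-space direction through the pixel
                     (z component must be nonzero).
    aperture_radius: lens radius; 0 degenerates to a pinhole camera.
    focal_distance:  distance to the plane in perfect focus.
    """
    # Point where the central (pinhole) ray intersects the focal plane.
    t = focal_distance / pixel_dir[2]
    focus_pt = tuple(t * d for d in pixel_dir)

    # Uniformly sample a point on the circular aperture (lens at z = 0).
    r = aperture_radius * math.sqrt(random.random())
    phi = 2.0 * math.pi * random.random()
    origin = (r * math.cos(phi), r * math.sin(phi), 0.0)

    # The lens ray goes from the aperture sample toward the in-focus point,
    # so geometry at focal_distance stays sharp while the rest blurs.
    direction = tuple(f - o for f, o in zip(focus_pt, origin))
    norm = math.sqrt(sum(d * d for d in direction))
    direction = tuple(d / norm for d in direction)
    return origin, direction
```

Averaging many such rays per pixel yields defocus blur; more elaborate lens models (e.g., with spherical aberration, as in Figure 1a) would bend each ray according to actual lens-surface geometry instead of this ideal refraction.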