We present a new pipeline that enables head-motion parallax in omnidirectional stereo (ODS) panorama video rendering using a neural depth decoder. While recent ODS panorama cameras record short-baseline horizontal stereo parallax to convey an impression of binocular depth, they lack the translational degrees of freedom (DoF) needed to also provide head-motion parallax in virtual reality (VR) applications. To overcome this limitation, we propose a pipeline that extends the classical ODS panorama format with 6-DoF free-viewpoint rendering by decomposing the scene into a multi-layer mesh representation. Given a spherical stereo panorama video, we use the horizontal disparity to store explicit depth information for both eyes in a simple neural decoder architecture. While this approach produces reasonable results for individual frames, video rendering typically suffers from temporal depth inconsistencies. We therefore fine-tune our depth decoder for both temporal and spatial smoothness in a subsequent optimization step. Using a consumer-grade ODS camera, we evaluate our approach on a number of real-world scene recordings and demonstrate the versatility and robustness of the proposed pipeline.
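For intuition, in the standard ODS camera model rays are tangent to a viewing circle of radius r (half the stereo baseline), and a scene point at distance d in the equatorial plane appears with angular disparity θ = 2 arcsin(r/d) between the two eye views, so depth can be recovered as d = r / sin(θ/2). The sketch below illustrates this relation together with simple spatial and temporal smoothness penalties of the kind the fine-tuning step describes; the PyTorch implementation, the function names, and the lack of motion compensation in the temporal term are illustrative assumptions, not the authors' implementation.

```python
import torch

def ods_disparity_to_depth(disparity_rad: torch.Tensor, radius: float) -> torch.Tensor:
    """Invert the standard ODS disparity relation d = r / sin(theta / 2).

    disparity_rad: per-pixel angular disparity (radians) between the
                   left- and right-eye equirectangular views.
    radius:        viewing-circle radius r (half the stereo baseline).
    """
    # Clamp to avoid division by zero for (near-)infinite depth.
    theta = disparity_rad.clamp(min=1e-6)
    return radius / torch.sin(theta / 2.0)

def smoothness_losses(depth_t: torch.Tensor, depth_t1: torch.Tensor):
    """Naive spatial and temporal smoothness penalties on depth maps.

    depth_t, depth_t1: (H, W) depth maps of two consecutive frames.
    """
    # Spatial smoothness: penalize large depth gradients within a frame.
    spatial = (depth_t[:, 1:] - depth_t[:, :-1]).abs().mean() \
            + (depth_t[1:, :] - depth_t[:-1, :]).abs().mean()
    # Temporal smoothness: penalize depth changes across frames
    # (a real pipeline would warp by scene motion before comparing).
    temporal = (depth_t1 - depth_t).abs().mean()
    return spatial, temporal
```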
Figure 1: We create a 6-DoF VR experience from a single omnidirectional stereo (ODS) pair of a scene. Our approach takes an ODS panorama as input (1), along with the radius of the viewing circle. We determine disparities between the left- and right-eye views using optical flow (2). The disparities (3) are used to obtain per-pixel depth (4), creating a point cloud (5) that is used to generate a depth-augmented stereo panorama (DASP).
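As an illustration of steps (4)–(5), a per-pixel depth map over an equirectangular panorama can be back-projected into a point cloud by converting each pixel's longitude and latitude, scaled by its depth, into Cartesian coordinates. The NumPy sketch below is an assumption for illustration: it uses a y-up, z-forward convention and, for simplicity, projects rays through the panorama center, ignoring the ODS viewing-circle offset.

```python
import numpy as np

def equirect_depth_to_pointcloud(depth: np.ndarray) -> np.ndarray:
    """Back-project an (H, W) equirectangular depth map to an (H*W, 3) point cloud.

    Assumes pixel (i, j) maps to longitude phi in [-pi, pi) and
    latitude lam in [-pi/2, pi/2]; y-up, z-forward convention.
    """
    h, w = depth.shape
    j, i = np.meshgrid(np.arange(w), np.arange(h))
    phi = (j + 0.5) / w * 2.0 * np.pi - np.pi   # longitude per column
    lam = np.pi / 2.0 - (i + 0.5) / h * np.pi   # latitude per row
    # Unit viewing direction per pixel, scaled by its depth.
    x = np.cos(lam) * np.sin(phi)
    y = np.sin(lam)
    z = np.cos(lam) * np.cos(phi)
    dirs = np.stack([x, y, z], axis=-1)
    return (dirs * depth[..., None]).reshape(-1, 3)
```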