Tianfan Xue, Google Research

Figure 1: (a) View 1; (b) View 2; (c) Anaglyph; (d) Aligned; (e) Ours. Stereo pairs (a, b) were imaged through glass and exhibit undesired reflections. The transmitted and reflected images are subject to parallax that is difficult to separate, as shown in the anaglyph (c). Our reflection-invariant flow aligns the two views with respect to the transmitted image, so that all remaining parallax (for example, in the reflection on the tissue box) is due to reflections, as shown in anaglyph (d). Our synthesis network exploits this parallax to remove reflections (e).
Videos captured by consumer cameras often exhibit temporal variations in color and tone caused by camera auto-adjustments such as white balance and exposure. When such videos are sub-sampled to play fast-forward, as in the increasingly popular timelapse and hyperlapse formats, these temporal variations are exacerbated and appear as visually disturbing high-frequency flickering. Previous techniques for photometrically stabilizing videos typically rely on computing dense correspondences between video frames and use these correspondences to remove all color changes in the sequence. However, this approach breaks down for fast-forward videos, which often have large content changes between frames and may also exhibit changes in scene illumination that should be preserved. In this work, we propose a novel photometric stabilization algorithm for fast-forward videos that is robust to large content variation across frames. We compute pairwise color and tone transformations between neighboring frames and smooth these pairwise transformations while taking into account the possibility of scene and content variations. This allows us to eliminate high-frequency fluctuations while still adapting to real variations in scene characteristics. We evaluate our technique on a new dataset consisting of controlled synthetic and real videos, and demonstrate that it outperforms the state of the art.
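As a rough illustration of the overall idea (not the authors' implementation), the sketch below fits a per-channel affine color transform between consecutive frames, accumulates these pairwise transforms into a per-frame trajectory, Gaussian-smooths that trajectory over time, and maps each frame onto the smoothed trajectory. All function names, the affine transform model, and the plain least-squares fit over all pixels are simplifying assumptions; the paper's method additionally handles large content changes between frames, which this sketch ignores.

```python
import numpy as np


def pairwise_color_transform(prev, curr):
    """Fit a per-channel affine transform (gain, bias) mapping curr -> prev.

    prev, curr: float arrays of shape (H, W, 3) in [0, 1]. For simplicity the
    fit uses all pixels, which implicitly assumes similar content in the two
    frames (a simplification relative to the paper).
    """
    gains, biases = np.ones(3), np.zeros(3)
    for c in range(3):
        x = curr[..., c].ravel()
        y = prev[..., c].ravel()
        A = np.stack([x, np.ones_like(x)], axis=1)
        (g, b), *_ = np.linalg.lstsq(A, y, rcond=None)
        gains[c], biases[c] = g, b
    return gains, biases


def photometric_stabilize(frames, sigma=5.0):
    """Suppress high-frequency color/tone flicker in a list of frames.

    Accumulates pairwise transforms into a cumulative trajectory relative to
    the first frame, smooths that trajectory with a temporal Gaussian, and
    re-expresses each frame on the smoothed trajectory, so that slow
    illumination changes survive while frame-to-frame flicker is removed.
    """
    n = len(frames)
    G = np.ones((n, 3))   # cumulative gains (frame t -> frame 0)
    B = np.zeros((n, 3))  # cumulative biases
    for t in range(1, n):
        g, b = pairwise_color_transform(frames[t - 1], frames[t])
        G[t] = G[t - 1] * g
        B[t] = G[t - 1] * b + B[t - 1]

    # Temporal Gaussian smoothing of the trajectory (gains in the log domain).
    ts = np.arange(n)
    W = np.exp(-0.5 * ((ts[:, None] - ts[None, :]) / sigma) ** 2)
    W /= W.sum(axis=1, keepdims=True)
    Gs = np.exp(W @ np.log(np.clip(G, 1e-3, None)))
    Bs = W @ B

    out = []
    for t, frame in enumerate(frames):
        # Map frame t onto the smoothed trajectory: S_t^{-1}(T_t(frame)).
        gain = G[t] / Gs[t]
        bias = (B[t] - Bs[t]) / Gs[t]
        out.append(np.clip(frame * gain + bias, 0.0, 1.0))
    return out
```

The smoothing window `sigma` trades off flicker suppression against responsiveness to genuine illumination changes: a larger value removes more fluctuation but lags behind real scene changes.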