Figure 1: This candlelit setting from the wine cave of a castle is difficult to photograph due to its low-light nature. A flash image captures the high-frequency texture and detail, but changes the overall scene appearance to cold and gray. The no-flash image captures the overall appearance of the warm candlelight, but is very noisy. We use the detail information from the flash image to both reduce noise in the no-flash image and sharpen its detail. Note the smooth appearance of the brown leather sofa and crisp detail of the bottles. For full-sized images, please see the supplemental DVD or the project website http://research.microsoft.com/projects/FlashNoFlash.

Abstract

Digital photography has made it possible to quickly and easily take a pair of images of low-light environments: one with flash to capture detail and one without flash to capture ambient illumination. We present a variety of applications that analyze and combine the strengths of such flash/no-flash image pairs. Our applications include denoising and detail transfer (to merge the ambient qualities of the no-flash image with the high-frequency flash detail), white-balancing (to change the color tone of the ambient image), continuous flash (to interactively adjust flash intensity), and red-eye removal (to repair artifacts in the flash image). We demonstrate how these applications can synthesize new images that are of higher quality than either of the originals.
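The abstract only names the denoising and detail-transfer applications; as an illustration of the general idea, the following is a minimal sketch of one way a flash/no-flash pair could be fused, assuming a simple Gaussian base/detail decomposition rather than the joint bilateral filtering the paper actually uses. All function names and parameters here are hypothetical.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def combine_flash_no_flash(ambient, flash, sigma=4.0, eps=0.02):
    """Toy flash/no-flash fusion (illustrative only, not the paper's
    algorithm): denoise the ambient image with a large-scale blur,
    then re-inject high-frequency detail taken from the flash image.

    ambient, flash: grayscale float arrays in [0, 1] of identical shape.
    """
    # Low-frequency estimate of the ambient image; blurring suppresses
    # the sensor noise but also removes fine detail.
    ambient_base = gaussian_filter(ambient, sigma=sigma)

    # Split the flash image into base and detail layers. The ratio
    # flash / flash_base carries the high-frequency texture; eps avoids
    # division blow-up in dark regions.
    flash_base = gaussian_filter(flash, sigma=sigma)
    detail = (flash + eps) / (flash_base + eps)

    # Transfer the flash detail onto the denoised ambient base, so the
    # result keeps the warm ambient tones but gains crisp texture.
    return np.clip(ambient_base * detail, 0.0, 1.0)
```

For color images one would apply this per channel (or use a per-axis sigma) so the blur does not mix channels; replacing the Gaussian blur with an edge-preserving filter would better match the behavior the abstract describes.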
Despite the ubiquity of slide presentations, managing multiple presentations remains a challenge. Understanding how multiple versions of a presentation are related to one another, assembling new presentations from existing presentations, and collaborating to create and edit presentations are difficult tasks. In this paper, we explore techniques for comparing and managing multiple slide presentations. We propose a general comparison framework for computing similarities and differences between slides. Based on this framework we develop an interactive tool for visually comparing multiple presentations. The interactive visualization facilitates understanding how presentations have evolved over time. We show how the interactive tool can be used to assemble new presentations from a collection of older ones and to merge changes from multiple presentation authors.

INTRODUCTION

Slide presentations have become a ubiquitous means of sharing information. In 2001, Microsoft estimated that at least 30 million PowerPoint presentations were created every day [19]. Knowledge workers often maintain collections of hundreds of presentations [3]. Moreover, it is common to create multiple versions of a presentation, adapting it as necessary to the audience or to other presentation constraints. One version may be designed as a 20 minute conference presentation for researchers, while another version may be designed as an hour long class for undergraduate students. Each version contains different aspects of the content. A common approach to building a new presentation is to study the collection of older versions and then assemble the appropriate pieces from the collection. Similarly, when collaborating with others on creating a presentation, the collaborators will often start from a common template, then separately fill in sections on their own and finally assemble the different versions together. Yet, current presentation creation tools [1,12,24] provide little support for working with multiple versions of a presentation simultaneously. The result is that assembling a new presentation from older versions can be very tedious.

In this paper we present new techniques and tools for visually comparing and managing multiple versions of slide presentations. Our work makes three main contributions:

Comparison framework: We develop a framework for comparing presentations to identify the subsets of slides that are similar across each version. There are a number of ways to measure similarity between presentations, including pixel-level image differences between slides, differences between the text on each slide, etc. We propose several such distance measures and discuss how they reveal the underlying similarities and differences between presentations.

Interactive visualization: We provide an interactive tool for viewing multiple versions of a presentation. Users can examine differences between presentations along any of the distance measures computed by our comparison framework. The visualization is designed to help users understand how the...
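The comparison framework is described only at a high level in this excerpt. As a rough sketch of the kind of distance measures it mentions (pixel-level image differences and text differences between slides), the snippet below computes a pairwise distance matrix between two slide decks. It assumes the slides have already been rendered to images and had their text extracted; every name in it is hypothetical, not from the paper.

```python
import numpy as np

def pixel_distance(img_a, img_b):
    """Mean absolute per-pixel difference between two slide renderings
    (same resolution, float arrays in [0, 1])."""
    return float(np.mean(np.abs(img_a - img_b)))

def text_distance(text_a, text_b):
    """Jaccard distance between the word sets of two slides: 0 when the
    text is identical, 1 when the slides share no words."""
    a, b = set(text_a.lower().split()), set(text_b.lower().split())
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)

def slide_distance_matrix(deck_a, deck_b, w_pixel=0.5, w_text=0.5):
    """Pairwise distances between slides of two presentations.

    deck_a, deck_b: lists of (rendered_image, extracted_text) tuples.
    Returns an array D where D[i, j] is the combined distance between
    slide i of deck_a and slide j of deck_b; low values suggest the two
    slides are versions of one another.
    """
    D = np.zeros((len(deck_a), len(deck_b)))
    for i, (img_a, txt_a) in enumerate(deck_a):
        for j, (img_b, txt_b) in enumerate(deck_b):
            D[i, j] = (w_pixel * pixel_distance(img_a, img_b)
                       + w_text * text_distance(txt_a, txt_b))
    return D
```

A matrix like this could feed the kind of cross-version visualization the excerpt describes, e.g. by thresholding D to mark which slides recur across versions.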