Figure 1: We automatically design and manufacture magic lenses to warp source images into specified target images. Here we photograph a source image (far left) viewed through a manufactured lens with 32×32 facets (left), resulting in four images depending on the lens's orientation atop the source.

Abstract: We present an automatic approach to design and manufacture passive display devices based on optical hidden image decoding. Motivated by classical steganography techniques, we construct Magic Lenses, composed of refractive lenslet arrays, to reveal hidden images when placed over potentially unstructured printed or displayed source images. We determine the refractive geometry of these surfaces by formulating and efficiently solving an inverse light transport problem, taking into account additional constraints imposed by the physical manufacturing processes. We fabricate several variants on the basic magic lens idea, including using a single source image to encode several hidden images that are only revealed when the lens is placed at prescribed orientations on the source image or viewed from different angles. We also present an important special case, the universal lens, which forms an injective mapping from the lens surface to the source image grid, allowing it to be used with arbitrary source images. We use this type of lens to generate hidden animation sequences. We validate our simulation results with many real-world manufactured magic lenses, and experiment with two separate manufacturing processes.
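The per-facet geometry behind such a lens can be illustrated with a small sketch: inverting Snell's law in vector form to find the flat-facet normal that refracts a viewer's ray toward a chosen point on the source image. This is a simplified single-interface illustration under an assumed refractive index, not the paper's full inverse light transport solver; `facet_normal_for_target` and `n_glass` are hypothetical names.

```python
import numpy as np

def facet_normal_for_target(lens_pt, src_pt, n_glass=1.5):
    """Find the surface normal of a flat facet that refracts a ray arriving
    straight down (viewer directly above) at lens_pt toward src_pt on the
    source-image plane below. Single air-to-glass interface; n_glass is an
    assumed refractive index. A sketch of the per-lenslet idea only."""
    d_in = np.array([0.0, 0.0, -1.0])            # incoming viewing ray
    d_out = np.asarray(src_pt, float) - np.asarray(lens_pt, float)
    d_out /= np.linalg.norm(d_out)               # desired refracted direction
    # Vector Snell's law: the tangential components satisfy
    # n1 * (d_in)_t = n2 * (d_out)_t, so n1*d_in - n2*d_out lies along the normal.
    n = 1.0 * d_in - n_glass * d_out
    return n / np.linalg.norm(n)

# A facet aimed at the point directly beneath it needs no tilt:
normal = facet_normal_for_target([0.0, 0.0, 1.0], [0.0, 0.0, 0.0])
```

Solving this per facet over a 32×32 grid yields a lenslet array that redirects each patch of the source image into its place in the hidden target image.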
In this paper, we consider rendering color videos using a non-photorealistic art form commonly called stippling. Stippling is the art of rendering images using point sets, possibly with various attributes such as sizes, elementary shapes, and colors. Producing good stippling is attractive not only for the sake of image depiction but also because it yields a compact vectorial format for storing the semantic information of media. Moreover, stippling is by construction easily tunable to various device resolutions without suffering from bitmap sampling artifacts when resizing. The underlying core technique for stippling images is to compute a centroidal Voronoi tessellation on a well-designed underlying density. This density relates to the image content and is used to compute a weighted Voronoi diagram. By treating videos as image sequences and properly initializing the stippling of each image with the result of its predecessor, one avoids undesirable point flickering, but the resulting stippled videos still exhibit noticeable artifacts. To overcome this, our method improves over the naive scheme by considering dynamic point creation and deletion according to the current scene's semantic complexity, and shows how to effectively vectorize video while adjusting for both color and contrast characteristics. Furthermore, we explain how to produce high-quality stippled "videos" (i.e., fully dynamic spatio-temporal point sets) for media containing various fading effects, quick object motions, or progressive shot changes. We report on the practical performance of our implementation, and present several stippled video results rendered on the fly using our viewer, which allows spatio-temporal dynamic rescaling (e.g., vectorial upscaling and frame-rate adjustment).
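The centroidal Voronoi tessellation at the core of stippling is typically computed with Lloyd-style relaxation: assign pixels to their nearest stipple point, then move each point to the density-weighted centroid of its cell. A minimal brute-force sketch of that loop follows; `weighted_lloyd` is a hypothetical name, and this toy nearest-pixel assignment stands in for the efficient Voronoi computation a real implementation would use.

```python
import numpy as np

def weighted_lloyd(points, density, iters=20):
    """Lloyd relaxation toward a weighted centroidal Voronoi tessellation.
    points:  (k, 2) array of stipple positions (x, y).
    density: (h, w) array of non-negative weights derived from image content
             (e.g., darker pixels get higher density).
    Each iteration assigns every pixel to its nearest point (a discrete
    Voronoi diagram) and moves each point to its cell's weighted centroid."""
    h, w = density.shape
    ys, xs = np.mgrid[0:h, 0:w]
    pix = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
    wts = density.ravel()
    for _ in range(iters):
        # Brute-force nearest-point assignment (discrete Voronoi labels).
        d2 = ((pix[:, None, :] - points[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        for k in range(len(points)):
            mask = labels == k
            mass = wts[mask].sum()
            if mass > 0:  # leave a point untouched if its cell is empty
                points[k] = (pix[mask] * wts[mask, None]).sum(axis=0) / mass
    return points
```

For video, the paper's scheme would seed each frame's relaxation with the converged points of the previous frame, which is what suppresses flickering; points are then created or deleted as the scene's density changes.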