For domains in which fitness is subjective or difficult to express formally, interactive evolutionary computation (IEC) is a natural choice. It is possible that a collaborative process combining feedback from multiple users can improve the quality and quantity of generated artifacts. Picbreeder, a large-scale online experiment in collaborative interactive evolution (CIE), explores this potential. Picbreeder is an online community in which users can evolve and share images, and most importantly, continue evolving others' images. Through this process of branching from other images, and through continually increasing image complexity made possible by the underlying neuroevolution of augmenting topologies (NEAT) algorithm, evolved images proliferate unlike in any other current IEC system. This paper discusses not only the strengths of the Picbreeder approach, but its challenges and shortcomings as well, in the hope that lessons learned will inform the design of future CIE systems.
Picbreeder is an online service that allows users to collaboratively evolve images. As in other interactive evolutionary computation (IEC) programs, users evolve images in Picbreeder by selecting ones that appeal to them to produce a new generation. However, Picbreeder also offers an online community in which to share these images and, most importantly, the ability to continue evolving others' images. Through this process of branching from other images, and through continually increasing image complexity made possible by the NeuroEvolution of Augmenting Topologies (NEAT) algorithm, evolved images proliferate unlike in any other current IEC system. Picbreeder enables all users, regardless of talent, to participate in a creative, exploratory process. This paper details how Picbreeder encourages innovation, featuring images that were collaboratively evolved.
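The select-and-branch loop both abstracts describe can be illustrated with a minimal sketch. This is not Picbreeder's actual implementation: the genome here is a plain list of floats and `mutate` is a hypothetical stand-in for NEAT's complexifying mutations; the point is that fitness is supplied by a user callback rather than a formal objective, and branching is simply restarting the loop from someone else's published genome.

```python
import random

def evolve(parent, pick, generations=3, pop_size=9, seed=0):
    """Minimal interactive-evolution loop: each generation, the user
    (here, the `pick` callback) selects a favorite, which parents the
    next population. Branching from a shared image is just calling
    evolve() with another user's published genome as `parent`."""
    rng = random.Random(seed)
    genome = parent
    for _ in range(generations):
        # Offspring are mutated copies of the selected parent.
        population = [mutate(genome, rng) for _ in range(pop_size)]
        genome = pick(population)  # subjective fitness: the user decides
    return genome

def mutate(genome, rng):
    # Stand-in for NEAT-style mutation; a real CPPN-NEAT genome would
    # also add nodes and connections, growing in complexity over time.
    return [g + rng.gauss(0, 0.1) for g in genome]
```

A scripted "user" that always picks the brightest candidate, for example, would be `evolve([0.0] * 4, pick=lambda pop: max(pop, key=sum))`.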
In Mixed Reality (MR) applications, immersion of virtual objects in captured video contributes to the perceived unification of two worlds, one real, one synthetic. Since virtual actors and surroundings may appear both closer and farther than real objects, compositing must consider spatial relationships in the resulting world. Chroma keying, often called blue screening or green screening, is one common solution to this problem. The method is under-constrained and most commonly addressed through a combination of environment preparation and commercial products. In interactive MR domains that impose restrictions on the video camera hardware, such as in experiences using video see-through (VST) head-mounted displays (HMDs), chroma keying becomes even more difficult due to the relatively low camera quality, the use of multiple camera sources (one per eye), and the required processing speed. Dealing with these constraints requires a fast and affordable solution. In our approach, we precondition the chroma key by using principal component analysis (PCA) to obtain usable alpha mattes from video streams in real-time on commodity graphics processing units (GPUs). In addition, we demonstrate how our method compares to off-line commercial keying tools and how it performs with respect to signal noise within the video stream.
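One way to read "preconditioning the chroma key with PCA" is: fit a PCA model to color samples of the backdrop, then score each frame pixel by its distance from the backdrop's principal color subspace. The sketch below shows that idea only; it is a CPU/NumPy illustration, not the paper's GPU implementation, and the `lo`/`hi` soft thresholds are tuning constants chosen here purely for illustration.

```python
import numpy as np

def fit_backdrop(samples):
    """Fit a PCA model to RGB samples of the chroma backdrop.
    samples: (N, 3) float array; returns the mean color and the
    principal axes of the backdrop color cloud."""
    mean = samples.mean(axis=0)
    centered = samples - mean
    # SVD rows of vt are the principal components, largest variance first.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt

def alpha_matte(frame, mean, vt, keep=1, lo=10.0, hi=40.0):
    """Per-pixel alpha from distance to the backdrop color subspace.
    frame: (H, W, 3) float RGB. keep: number of components spanning
    expected backdrop variation (lighting, camera noise). Pixels near
    the subspace key out (alpha ~ 0); distant pixels are foreground."""
    pixels = frame.reshape(-1, 3) - mean
    basis = vt[:keep]                      # (keep, 3)
    proj = pixels @ basis.T @ basis        # component inside the subspace
    dist = np.linalg.norm(pixels - proj, axis=1)
    # Soft-threshold the residual into [0, 1] for a usable matte.
    alpha = np.clip((dist - lo) / (hi - lo), 0.0, 1.0)
    return alpha.reshape(frame.shape[:2])
```

Because the per-pixel work reduces to a small matrix product and a clamp, the same computation maps naturally onto a fragment shader, which is consistent with the real-time-on-GPU claim above.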