Figure 1 An image analogy. Our problem is to compute a new "analogous" image B′ that relates to B in "the same way" as A′ relates to A. Here, A, A′, and B are inputs to our algorithm, and B′ is the output. The full-size images are shown in Figures 10 and 11.
Abstract

This paper describes a new framework for processing images by example, called "image analogies." The framework involves two stages: a design phase, in which a pair of images, with one image purported to be a "filtered" version of the other, is presented as "training data"; and an application phase, in which the learned filter is applied to some new target image in order to create an "analogous" filtered result. Image analogies are based on a simple multiscale autoregression, inspired primarily by recent results in texture synthesis. By choosing different types of source image pairs as input, the framework supports a wide variety of "image filter" effects, including traditional image filters, such as blurring or embossing; improved texture synthesis, in which some textures are synthesized with higher quality than by previous approaches; super-resolution, in which a higher-resolution image is inferred from a low-resolution source; texture transfer, in which images are "texturized" with some arbitrary source texture; artistic filters, in which various drawing and painting styles are synthesized based on scanned real-world examples; and texture-by-numbers, in which realistic scenes, composed of a variety of textures, are created using a simple painting interface.

Please see http://grail.cs.washington.edu/projects/image-analogies/ for additional information and results.

While image analogies are clearly a desirable goal, it is not so clear how they might be achieved.
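As a rough illustration only, and not the algorithm developed later in the paper, the core mechanism can be sketched as neighborhood matching: for each pixel of the new target B, find the pixel of A whose local neighborhood looks most similar, and copy the corresponding pixel of A′ into B′. The Python/NumPy sketch below assumes grayscale images stored as 2-D float arrays, uses a brute-force single-level search, and omits the multiscale pyramid, feature vectors, and coherence search that the actual method relies on; all names and parameters are illustrative.

import numpy as np

def neighborhoods(img, k):
    # Stack the k-by-k neighborhood of every interior pixel into a row vector.
    r = k // 2
    h, w = img.shape
    feats = np.empty((h - 2 * r, w - 2 * r, k * k))
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            idx = (dy + r) * k + (dx + r)
            feats[:, :, idx] = img[r + dy:h - r + dy, r + dx:w - r + dx]
    return feats.reshape(-1, k * k)

def image_analogy_single_level(A, Ap, B, k=5):
    # Brute-force sketch: for each interior pixel of B, find the pixel of A
    # whose k-by-k neighborhood best matches (L2 distance) and copy the
    # corresponding pixel of A' into B'.  Image borders are left unfilled
    # for brevity.
    r = k // 2
    fa = neighborhoods(A, k)              # training neighborhoods
    fb = neighborhoods(B, k)              # target neighborhoods
    Bp = np.zeros_like(B)
    inner_w_a = A.shape[1] - 2 * r        # width of A's interior grid
    inner_w_b = B.shape[1] - 2 * r        # width of B's interior grid
    for i, f in enumerate(fb):
        j = np.argmin(((fa - f) ** 2).sum(axis=1))   # best match in A
        ay, ax = divmod(j, inner_w_a)
        by, bx = divmod(i, inner_w_b)
        Bp[by + r, bx + r] = Ap[ay + r, ax + r]      # copy "filtered" pixel
    return Bp

Even this crude single-level search conveys why the approach is example-driven: the "filter" is never specified explicitly, but is implied by the correspondence between A and A′.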