The color structure of a home scene image is closely related to the material properties of its local regions. Existing color migration methods typically fail to fully infer the correlations among the colors of local home scene regions, which leads to local blurring. In this paper, we propose a color migration framework for home scene images. It picks up the coloring of a template image and transfers that coloring to a home scene image through a simple interaction. Our framework comprises three main parts. First, we perform an interactive segmentation to divide an image into local regions and extract their corresponding colors. Second, we generate a matching color table by sampling the template image according to the color structure of the original home scene image. Finally, we transfer colors from the matching color table to the target home scene image while maintaining the boundary transitions. Experimental results show that our method can effectively transform the coloring of a scene to match the color composition of a given natural or interior scene.
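The sketch below illustrates the first step under simplifying assumptions: region masks are taken as given (in the paper they come from the interactive segmentation), and the dominant colors of each region are estimated with k-means. The function name build_regional_color_table and the use of k-means are illustrative stand-ins, not the paper's implementation.

# Minimal sketch, assuming region masks are already available as boolean
# arrays from an interactive segmentation tool; dominant colors per region
# are approximated with k-means (an assumption, not the paper's method).
import numpy as np
from sklearn.cluster import KMeans

def build_regional_color_table(image, region_masks, n_colors=3):
    """Return, for each region, its dominant colors and their area shares."""
    table = []
    for mask in region_masks:
        pixels = image[mask].astype(np.float32)      # pixels of this region, (N, 3)
        if len(pixels) == 0:
            continue
        k = min(n_colors, len(pixels))
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(pixels)
        counts = np.bincount(km.labels_, minlength=k)
        order = np.argsort(-counts)                  # most frequent cluster first
        table.append({
            "colors": km.cluster_centers_[order],    # dominant colors of the region
            "weights": counts[order] / counts.sum(), # relative area of each color
        })
    return table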
Indoor home scene coloring technology is a hot topic in home design, helping users make home coloring decisions. Image-based home scene coloring is preferable for e-commerce customers since it only requires users to describe coloring expectations or manipulate colors through images, which is intuitive and inexpensive. In contrast, coloring a home scene based on 3D scenes is expensive because of the high cost and time of obtaining 3D models and constructing 3D scenes. To realize image-based home scene coloring, our framework extracts the coloring of individual pieces of furniture together with their relationships. This allows us to formulate the color structure of the home scene, which serves as the basis for color migration. Our work is challenging because it is not straightforward to identify the coloring of furniture and their parts, nor the coloring relationships among furniture. This paper presents a new color migration framework for home scenes. We first extract local coloring from a home scene image to form a regional color table. We then generate a matching color table from a template image based on its color structure. Finally, we transform the coloring of the target image based on the matching color table while maintaining the boundary transitions among image regions (a sketch of these two steps follows below). We also introduce an interactive operation to guide this transformation. Experiments show that our framework produces good results that meet human visual expectations.

CCS CONCEPTS
• Human-centered computing → Scenario-based design; Interaction design theory, concepts and paradigms;
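The following sketch complements the one above and illustrates the second and third steps under simplifying assumptions: dominant template colors are matched to target regions in order of the area they cover, and boundary transitions are approximated with a Gaussian-feathered color shift. The helper names (template_palette, recolor), the area-ordered matching, and the feathering are hypothetical substitutes for the paper's sampling strategy and boundary-transition handling.

# Minimal sketch, assuming a simple area-ordered matching between template
# colors and target regions, and Gaussian feathering as a stand-in for the
# paper's boundary-transition maintenance.
import cv2
import numpy as np
from sklearn.cluster import KMeans

def template_palette(template_bgr, n_colors):
    """Dominant colors of the template, ordered by how much area they cover."""
    pixels = template_bgr.reshape(-1, 3).astype(np.float32)
    km = KMeans(n_clusters=n_colors, n_init=10, random_state=0).fit(pixels)
    counts = np.bincount(km.labels_, minlength=n_colors)
    return km.cluster_centers_[np.argsort(-counts)]

def recolor(target_bgr, region_masks, palette, feather=15):
    """Shift each region toward its matched template color with soft borders."""
    out = target_bgr.astype(np.float32)
    for mask, new_color in zip(region_masks, palette):
        # Feather the binary mask so the color shift fades out at region borders.
        soft = cv2.GaussianBlur(mask.astype(np.float32), (0, 0), feather)[..., None]
        region_mean = out[mask].mean(axis=0)   # current mean color of the region
        out = out + soft * (new_color - region_mean)
    return np.clip(out, 0, 255).astype(np.uint8)

# Illustrative usage (file names and masks are placeholders):
# masks   = [...]  # boolean masks from the interactive segmentation step
# palette = template_palette(cv2.imread("template.jpg"), n_colors=len(masks))
# result  = recolor(cv2.imread("scene.jpg"), masks, palette)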