Image colorization is time-consuming and laborious work, and color composition is an important factor in determining the quality of an artwork. Automatic image colorization is therefore a topic of considerable research significance and application value. With the development of computer hardware, deep learning techniques have achieved satisfactory results in automatic colorization. According to the source of color information, automatic colorization methods can be divided into three types: colorization based on prior knowledge, colorization based on reference images, and interactive colorization. These methods meet the needs of most users, but they have limitations; for example, a user cannot colorize different objects in the same picture with different reference images. To address this problem, this paper proposes a deep learning method that combines instance segmentation and image fusion to achieve mixed, region-wise colorization: foreground regions are colorized from reference images, while the background is colorized from prior knowledge. To identify the multiple objects and background regions in an image and fuse the final colorization results, a CNN-based image colorization method is proposed. First, a CNN is used to extract the semantic information of the grayscale image and the reference image. According to the extracted semantic information, the color of a designated region of the reference image is transferred to the corresponding region of the grayscale image. During this transfer, the images, combined with their semantic information, are fed into a CNN to obtain the content feature maps of the grayscale image and the style feature maps of the reference image. A random noise image is then iteratively optimized so that it approaches the content feature maps as a whole while its target region approaches the designated region of the style feature maps. Experimental results show that the proposed method produces good colorization results and has clear advantages in both network size and coloring quality.
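To make the region-wise optimization concrete, the following is a minimal sketch, not the authors' implementation, of the masked style-transfer objective described above. It assumes PyTorch with a pretrained VGG-19 as the feature extractor; the layer indices, loss weights, optimizer settings, and the binary mask `mask` marking the target region are all illustrative assumptions.

```python
# Minimal sketch of region-masked style transfer for colorization (assumptions:
# PyTorch, torchvision VGG-19 features, illustrative layer choices and weights).
import torch
import torch.nn.functional as F
from torchvision import models

device = "cuda" if torch.cuda.is_available() else "cpu"
vgg = models.vgg19(weights=models.VGG19_Weights.DEFAULT).features.to(device).eval()
for p in vgg.parameters():
    p.requires_grad_(False)

CONTENT_LAYER = 21                  # conv4_2 in VGG-19 (assumed choice)
STYLE_LAYERS = [0, 5, 10, 19, 28]   # conv1_1 ... conv5_1 (assumed choice)

def extract(x, layers):
    """Return feature maps of `x` at the requested VGG-19 layer indices."""
    feats, h = {}, x
    for i, layer in enumerate(vgg):
        h = layer(h)
        if i in layers:
            feats[i] = h
        if i >= max(layers):
            break
    return feats

def gram(f):
    """Gram matrix of a (1, C, H, W) feature map."""
    _, c, h, w = f.shape
    f = f.view(c, h * w)
    return f @ f.t() / (c * h * w)

def colorize_region(gray_rgb, ref_rgb, mask, steps=300, style_weight=1e4):
    """Optimize a random noise image so its content matches the grayscale
    image everywhere, while its style (Gram statistics) matches the
    reference image only inside `mask`, a (1, 1, H, W) tensor of 0/1 values
    marking the designated region.  `gray_rgb` is the grayscale image
    replicated to three channels; `ref_rgb` is the color reference."""
    content_feat = extract(gray_rgb, [CONTENT_LAYER])[CONTENT_LAYER]
    style_feats = extract(ref_rgb, STYLE_LAYERS)

    out = torch.randn_like(gray_rgb, requires_grad=True)  # random noise init
    opt = torch.optim.Adam([out], lr=0.02)
    for _ in range(steps):
        opt.zero_grad()
        feats = extract(out, STYLE_LAYERS + [CONTENT_LAYER])
        # Content loss over the whole image.
        loss = F.mse_loss(feats[CONTENT_LAYER], content_feat)
        # Style loss restricted to the masked target region.
        for i in STYLE_LAYERS:
            m = F.interpolate(mask, size=feats[i].shape[-2:], mode="nearest")
            loss = loss + style_weight * F.mse_loss(gram(feats[i] * m),
                                                    gram(style_feats[i] * m))
        loss.backward()
        opt.step()
    return out.detach()
```

In this sketch, each detected instance would be colorized with its own reference image by calling `colorize_region` with that instance's mask, and the per-region results would then be fused with the background colorization to form the final image.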