This study presents a deep-learning model, based on the Conditional Generative Adversarial Nets (CGAN) technique, that can produce daytime visible (VIS) band information, mimicking a narrow-band sensor, by combining VIS and infrared (IR) broadband measurements from another sensor. Real observations from the Geostationary Ocean Color Imager (GOCI) and Meteorological Imager (MI) sensors onboard the Communication, Ocean, and Meteorological Satellite were used to train and test our CGAN model over the Yellow Sea and Bohai Sea. The trained and tested CGAN model was then applied to generate daytime GOCI VIS and near-IR (NIR) bands (0.412 μm to 0.865 μm), using daytime MI VIS (0.675 μm), shortwave IR (3.75 μm), and longwave IR (10.8 and 12.0 μm) bands, together with the differences between them, as input data. GOCI and MI data were collected from January 2017 to December 2018, with 705 images of 256×256 pixels used for training and 44 images for model testing. The comparison between the real GOCI VIS band observations and the CGAN-generated simulations is statistically favorable (i.e., bias = −0.013 (in a reflectance unit from 0 to 1), root-mean-square error = 0.112, mean absolute error = 0.076, agreement index = 0.945, and correlation coefficient (CC) = 0.809 for daytime reflectance in the GOCI VIS 0.49 μm band). Our CGAN-based model showed high CC and favorable results across the GOCI VIS and NIR bands. Consequently, our study demonstrates the possibility of applying a deep-learning technique to improve the temporal resolution of ocean color studies using the GOCI sensor.
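For reference, the verification statistics named above (bias, root-mean-square error, mean absolute error, agreement index, and correlation coefficient) can be computed with a short script such as the following minimal sketch in Python/NumPy. The function name `evaluation_metrics` and the use of Willmott's form for the agreement index are illustrative assumptions; the abstract does not specify the exact implementation used in the study.

```python
import numpy as np

def evaluation_metrics(simulated, observed):
    """Compute skill scores for a pair of reflectance arrays (values in [0, 1]);
    invalid pixels are assumed to have been masked out beforehand."""
    sim = np.asarray(simulated, dtype=float).ravel()
    obs = np.asarray(observed, dtype=float).ravel()

    bias = np.mean(sim - obs)                      # mean error (simulated minus observed)
    rmse = np.sqrt(np.mean((sim - obs) ** 2))      # root-mean-square error
    mae = np.mean(np.abs(sim - obs))               # mean absolute error

    # Agreement index in Willmott's form (an assumption; the paper may define
    # its "agreement index" differently).
    obs_mean = np.mean(obs)
    d = 1.0 - np.sum((sim - obs) ** 2) / np.sum(
        (np.abs(sim - obs_mean) + np.abs(obs - obs_mean)) ** 2
    )

    cc = np.corrcoef(sim, obs)[0, 1]               # Pearson correlation coefficient

    return {"bias": bias, "rmse": rmse, "mae": mae, "agreement_index": d, "cc": cc}


if __name__ == "__main__":
    # Toy usage: random reflectance fields standing in for an observed GOCI
    # 0.49 um band patch and its CGAN-generated counterpart (256x256 pixels,
    # matching the image size used in the paper).
    rng = np.random.default_rng(0)
    observed = rng.uniform(0.0, 1.0, size=(256, 256))
    simulated = np.clip(observed + rng.normal(0.0, 0.1, size=(256, 256)), 0.0, 1.0)
    print(evaluation_metrics(simulated, observed))
```

The same routine can be applied band by band to reproduce the per-band statistics reported for the GOCI VIS and NIR channels.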