Remote sensing images are prone to significant photometric variation caused by changing seasons, illumination, and atmospheric conditions. These variations often produce visible stitching seams at the edges of mosaic images, degrading both the visual quality and the interpretability of the data. Conventional approaches to color inconsistency rely on absolute radiometric correction and relative radiometric normalization, but these methods often fail to handle complex variations or to produce visually pleasing results. This paper introduces a novel approach based on Neural Radiance Fields (NeRF) for correcting color inconsistencies in multi-view images. Our method leverages an implicit representation and re-illumination of the feature space to capture the intrinsic radiance and reflectance properties of the scene. By fusing image features from multiple views, we generate a fused image that seamlessly integrates their color information, improving color consistency and reducing stitching seams. To evaluate the effectiveness of our approach, we conducted experiments on satellite and UAV images with large variations in spatial extent and acquisition time. The experimental results demonstrate that our NeRF-based method produces synthesized images with high visual quality and smooth color transitions at image edges. The fused images exhibit enhanced color consistency, effectively suppressing visible stitching seams and raising overall image quality.
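The abstract does not specify the network design, but the core idea of separating intrinsic scene radiance from per-image photometric conditions can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch example assuming a NeRF-style MLP with a learned per-image appearance embedding (in the spirit of NeRF-in-the-Wild); the class name `AppearanceNeRF`, all layer sizes, and the shared-code "re-illumination" step are illustrative assumptions, not the paper's actual method.

```python
import torch
import torch.nn as nn

class AppearanceNeRF(nn.Module):
    """Minimal NeRF-style MLP with a per-image appearance embedding.

    Density depends only on position; color additionally conditions on
    the view direction and a learned per-image latent code that absorbs
    photometric variation (season, illumination, atmosphere).
    """

    def __init__(self, num_images, pos_dim=63, dir_dim=27, appear_dim=16, hidden=256):
        super().__init__()
        # One learnable appearance code per source image (hypothetical size).
        self.appearance = nn.Embedding(num_images, appear_dim)
        self.trunk = nn.Sequential(
            nn.Linear(pos_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.sigma_head = nn.Linear(hidden, 1)  # view-independent density
        self.color_head = nn.Sequential(        # view/appearance-dependent color
            nn.Linear(hidden + dir_dim + appear_dim, hidden // 2), nn.ReLU(),
            nn.Linear(hidden // 2, 3), nn.Sigmoid(),
        )

    def forward(self, pos_enc, dir_enc, image_ids):
        h = self.trunk(pos_enc)
        sigma = torch.relu(self.sigma_head(h))
        a = self.appearance(image_ids)  # per-image photometric latent
        rgb = self.color_head(torch.cat([h, dir_enc, a], dim=-1))
        return rgb, sigma

# "Re-illumination": after training, render every view with one shared
# appearance code so all outputs share a consistent color response.
model = AppearanceNeRF(num_images=10)
pos_enc = torch.randn(4096, 63)                  # positional-encoded 3D samples
dir_enc = torch.randn(4096, 27)                  # encoded view directions
shared_id = torch.zeros(4096, dtype=torch.long)  # a reference image's code for all rays
rgb, sigma = model(pos_enc, dir_enc, shared_id)
```

Under this assumed design, per-image latents soak up photometric differences during training, so rendering the whole mosaic with a single shared code yields color-consistent output without any explicit seam blending.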