The most essential task in image processing applications is removing noisy pixels from images to obtain noise-free images with high reconstruction quality. This task has high computational complexity because image patches must be segmented from the noisy images. To simplify it, Neuro-Fuzzy filtering with Optimum Adaptive parameterized Mask Non-Harmonic Analysis in the Curvelet Transform domain (NF-OAMNHA-CT) was proposed for image denoising. It uses a Neuro-Fuzzy (NF) edge detector instead of the Canny edge detector to segment and preserve edge regions, including homogeneous texture boundaries, with minimal edge distortion. However, training the NF requires gradient information of the error function with respect to its parameters, and the training often becomes trapped in local minima. Therefore, this article proposes a Recurrent Polak-Ribière-Polyak Conjugate Gradient-based Neuro-Fuzzy (RPCGNF-OAMNHA-CT) technique to enhance edge-preserving segmentation in image denoising. First, the noisy images are represented in the CT domain and fed to the RPCGNF edge detector, which performs edge-preserving segmentation by extracting both edge regions and homogeneous texture regions. In the RPCGNF technique, a recurrent mechanism is applied to the Polak-Ribière-Polyak Conjugate Gradient-based NF (PCGNF), yielding an updated edge-preserving segmentation scheme that learns its parameters efficiently and speeds up convergence. As a result, edge regions and homogeneous textures are segmented simultaneously from the noisy image until the PCGNF either converges to the desired segmentation accuracy or reaches a termination criterion. OAMNHA is then applied to each segment to remove the noisy pixels and recover the noise-free images accurately. Finally, experimental results demonstrate that the proposed RPCGNF-OAMNHA-CT technique achieves higher efficiency than the NF-OAMNHA-CT technique in terms of Peak Signal-to-Noise Ratio (PSNR), Mean Absolute Error (MAE), and Structural Similarity (SSIM).
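For context, the training step named above builds on the standard Polak-Ribière-Polyak conjugate gradient iteration; a minimal sketch of that generic update on the NF parameter vector is given below, where the error function $E$, its gradient $g_k$, and the step size $\alpha_k$ are generic placeholders rather than quantities specified in this abstract:

\[
\beta_k^{\mathrm{PRP}} = \frac{g_k^{\top}\left(g_k - g_{k-1}\right)}{\lVert g_{k-1} \rVert^{2}},
\qquad
d_k = -g_k + \beta_k^{\mathrm{PRP}}\, d_{k-1},
\qquad
\theta_{k+1} = \theta_k + \alpha_k d_k,
\]

where $g_k = \nabla_{\theta} E(\theta_k)$ is the gradient of the training error with respect to the NF parameters at iteration $k$, $d_k$ is the search direction (initialized as $d_0 = -g_0$), and $\alpha_k > 0$ is a step size, e.g. chosen by a line search. The recurrent mechanism proposed in the article modifies how this iteration is applied, which is what distinguishes RPCGNF from a plain PCG-trained NF.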