The imaging of objects using high-resolution detectors coupled to CT systems can be challenging due to the presence of ring artifacts in the reconstructed data. Not only are these artifacts qualitatively distracting, they also reduce the SNR of the reconstructed data and may diminish its clinical utility. To address these challenges, we introduce a multistep algorithm that greatly reduces the impact of ring artifacts on the reconstructed data through image processing in sinogram space. First, for a single row of detectors corresponding to one slice, we compute the mean of each detector element across all projection view angles and place the reciprocal values in a vector whose length equals the number of detector elements in the row. This vector is then multiplied with the value of each detector element at each projection view angle, yielding a normalized (corrected) sinogram. This corrected sinogram is subtracted from the original uncorrected sinogram of the slice to obtain a difference map, which is then smoothed with a median filter along the row direction. The smoothed difference map is added back to the corrected sinogram to obtain the final sinogram, which can be back-projected to yield an axial slice of the scanned object with a greatly reduced presence of ring artifacts. This process is repeated for each detector row, i.e., for each slice. The performance of this algorithm was assessed using images of a mouse femur acquired with a micro-CT system coupled to a high-resolution CMOS detector. We found that applying the algorithm increased the SNR and produced a more uniform line profile, as a result of the reduction in ring artifacts.
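A minimal sketch of the per-slice correction described above is given below. It assumes a sinogram array of shape (number of views, number of detector elements); the function name, the median-filter width, and the use of scipy.ndimage.median_filter are illustrative choices not specified in the text.

```python
import numpy as np
from scipy.ndimage import median_filter

def correct_ring_artifacts(sinogram, kernel_size=9):
    """Sinogram-space ring-artifact reduction for one slice.

    sinogram    : 2-D array, shape (n_views, n_detectors); each row is
                  one projection view angle.
    kernel_size : width of the median filter along the detector (row)
                  direction (an assumed value; not given in the text).
    """
    # Step 1: mean of each detector element across all view angles
    # (assumed nonzero), whose reciprocals form the normalization vector.
    column_means = sinogram.mean(axis=0)

    # Step 2: multiply each detector value at each view angle by the
    # reciprocal mean, i.e., divide by the mean (broadcast over views),
    # yielding the normalized (corrected) sinogram.
    corrected = sinogram / column_means

    # Step 3: subtract the corrected sinogram from the original to
    # obtain the difference map.
    difference = sinogram - corrected

    # Step 4: smooth the difference map with a median filter applied
    # along the row (detector) direction only.
    smoothed = median_filter(difference, size=(1, kernel_size))

    # Step 5: add the smoothed difference back to the corrected
    # sinogram to obtain the final, artifact-reduced sinogram.
    return corrected + smoothed
```

The returned sinogram would then be reconstructed as usual (e.g., by filtered backprojection) to obtain the axial slice, and the function applied independently to each detector row's sinogram.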