With the recent revolution in internet connectivity and the fast-growing prevalence of camera-enabled devices, images play a vital role in many areas of modern life. Photographs, which have long been accepted as evidence in courts, are now subject to increasingly sophisticated forgery. To detect stitching forgeries that combine originally unrelated people or scenes, as well as other compositing manipulations, this paper first presents an algorithm that extracts multiple image features, including grayscale, complementary color wavelet (CCW) based chroma, sharpness, and natural scene statistics (NSS) features. It is then shown that a random forest model trained on these features can be used to classify images as stitching-tampered or untampered. Experimental results show that the proposed algorithm outperforms the techniques reported in the literature, achieving state-of-the-art accuracy of 91%, 95.24%, and 88.02% on the Tampering ImageNet, Columbia, and CASIA ITDE V2.0 datasets, respectively. Precision, recall, and F1-score are also improved.
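As a rough illustration of the classification stage only, and not of the paper's feature extractor, the following sketch assumes pre-computed per-image feature vectors (e.g., grayscale, CCW-based chroma, sharpness, and NSS statistics concatenated into one row) and trains a scikit-learn RandomForestClassifier to separate tampered from untampered images; the data shown here is a placeholder.

```python
# Minimal sketch of the random-forest classification stage, assuming the
# per-image feature vectors (grayscale, CCW-based chroma, sharpness, NSS)
# have already been extracted; the feature extractor is not reproduced here.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Placeholder data: one row of concatenated features per image,
# label 1 = stitching-tampered, 0 = untampered.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 64))      # hypothetical feature matrix
y = rng.integers(0, 2, size=500)    # hypothetical labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

pred = clf.predict(X_test)
acc = accuracy_score(y_test, pred)
prec, rec, f1, _ = precision_recall_fscore_support(
    y_test, pred, average="binary")
print(f"accuracy={acc:.3f} precision={prec:.3f} recall={rec:.3f} f1={f1:.3f}")
```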