Abstract: Dong et al. examined the ability of 51 computational feature sets to estimate human perceptual texture similarity; however, none performed well for this task. While it is well known that the human visual system is extremely adept at exploiting longer-range aperiodic (and periodic) "contour" characteristics in images, none of the investigated feature sets exploits higher-order statistics (HOS) over larger image regions (>19×19 pixels). We therefore hypothesise that long-range HOS, in the form of contour data, are useful for perceptual texture similarity estimation. We present the results of a psychophysical experiment which show that contour data are more important to human observers for this task than local image patches or global second-order data. Inspired by this finding, we propose a set of perceptually motivated image features (PMIF) that encode long-range HOS computed from the spatial and angular distributions of contour segments. We use two perceptual texture similarity estimation tasks to compare PMIF against the 51 feature sets referred to above and four commonly used contour representations. The new feature set is also examined in the context of two additional tasks: sketch-based image retrieval and natural scene recognition. The results show that the proposed feature set performs better than, or at least comparably to, all the other feature sets. We attribute this promising performance to the fact that PMIF exploits both short-range and long-range HOS.
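
To give a concrete picture of what "spatial and angular distributions of contour segments" could look like in practice, the sketch below builds a toy descriptor from the pairwise relative orientations and distances of contour segments. It is a minimal illustration under our own assumptions (Canny edges, polygonal approximation of contours, a fixed joint histogram binning), not the authors' PMIF; names such as pmif_like_descriptor are hypothetical.

```python
# Minimal, hypothetical sketch of a contour-segment descriptor in the spirit of
# the abstract: long-range, higher-order statistics over contour data.
# NOT the authors' PMIF; all design choices here are illustrative assumptions.
import numpy as np
import cv2


def contour_segments(gray, eps=2.0):
    """Extract straight-line segments by polygonal approximation of edge contours."""
    edges = cv2.Canny(gray, 50, 150)
    # OpenCV 4.x return signature (contours, hierarchy).
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
    segments = []
    for c in contours:
        approx = cv2.approxPolyDP(c, eps, False).reshape(-1, 2)
        for p, q in zip(approx[:-1], approx[1:]):
            segments.append((p.astype(float), q.astype(float)))
    return segments


def pmif_like_descriptor(gray, n_angle=8, n_dist=4):
    """Joint histogram of relative orientation and normalised distance between
    pairs of contour segments (a long-range statistic spanning the whole image)."""
    segs = contour_segments(gray)
    if len(segs) < 2:
        return np.zeros(n_angle * n_dist)
    mids = np.array([(p + q) / 2.0 for p, q in segs])
    # Segment orientation in [0, pi); points are stored as (x, y).
    angles = np.array([np.arctan2(*(q - p)[::-1]) % np.pi for p, q in segs])
    diag = np.hypot(*gray.shape)  # image diagonal, used to normalise distances
    rel_ang, rel_dist = [], []
    # O(n^2) over segment pairs; a real implementation would subsample.
    for i in range(len(segs)):
        for j in range(i + 1, len(segs)):
            rel_ang.append(abs(angles[i] - angles[j]) % np.pi)
            rel_dist.append(np.linalg.norm(mids[i] - mids[j]) / diag)
    hist, _, _ = np.histogram2d(rel_ang, rel_dist, bins=[n_angle, n_dist],
                                range=[[0, np.pi], [0, 1]])
    hist = hist.ravel()
    return hist / (hist.sum() + 1e-9)


if __name__ == "__main__":
    img = cv2.imread("texture.png", cv2.IMREAD_GRAYSCALE)  # any grayscale texture
    if img is not None:
        d = pmif_like_descriptor(img)
        print("descriptor length:", d.shape[0])
```

Such a descriptor could be compared between two textures with any histogram distance (e.g. chi-squared) to obtain a similarity estimate; the point of the sketch is only that angular and spatial relations between contour segments capture statistics well beyond a 19×19 neighbourhood.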