Image retrieval is a significant and active research topic that has shifted the focus of researchers from keyword-based toward semantic-based image retrieval. Nevertheless, existing image retrieval studies still lack adequate semantic image description and consideration of user behavior. There is therefore a need to bridge the semantic gap between low-level visual patterns and high-level concepts for a better understanding between humans and machines. Hence, this research devises an effective medical image retrieval strategy using convoluted neighborhood-based ordered-dither block truncation coding (ODBTC). The developed approach modifies the ODBTC concept with a convoluted neighborhood mechanism. Here, the convoluted neighborhood-based color co-occurrence feature (CCF) and convoluted neighborhood-based bit pattern feature (BBF) are extracted. Finally, cross-indexing is performed to convert the feature points into binary codes for effective image retrieval. The proposed convoluted neighborhood-based ODBTC achieves a maximum precision, recall, and F-measure of 0.740, 0.680, and 0.709, respectively.
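To make the feature-extraction pipeline concrete, the following is a minimal illustrative sketch of ODBTC-style block features on a grayscale image. It is not the authors' convoluted neighborhood mechanism or cross-indexing scheme; the quantizer pair is approximated here by the block minimum and maximum, the bit pattern by thresholding at the block mean, and the function name and histogram sizes are assumptions for illustration only.

```python
import numpy as np

def odbtc_features(image, block=2, levels=8):
    """Illustrative ODBTC-style features on a grayscale image (values 0-255).

    Per block, the two ODBTC quantizers are approximated by the block
    minimum and maximum; the bit pattern marks pixels at or above the
    block mean. Returns:
      ccf - a coarse (levels x levels) histogram over the (min, max)
            quantizer pair, standing in for a CCF-like descriptor
      bpf - a histogram over the 2**(block*block) possible bit patterns,
            standing in for a bit-pattern-like descriptor
    """
    h, w = image.shape
    h -= h % block
    w -= w % block
    img = image[:h, :w].astype(np.float64)

    ccf = np.zeros((levels, levels))
    bpf = np.zeros(2 ** (block * block))

    for r in range(0, h, block):
        for c in range(0, w, block):
            blk = img[r:r + block, c:c + block]
            lo, hi = blk.min(), blk.max()            # quantizer pair
            bits = (blk >= blk.mean()).astype(int)   # bit pattern
            # quantizer-pair histogram (CCF-like)
            ccf[int(lo * levels / 256), int(hi * levels / 256)] += 1
            # bit-pattern histogram (BPF-like), pattern read row-major
            code = int("".join(map(str, bits.ravel())), 2)
            bpf[code] += 1

    # L1-normalise so feature vectors are comparable across image sizes
    ccf /= ccf.sum()
    bpf /= bpf.sum()
    return ccf.ravel(), bpf.ravel()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, size=(64, 64))
    ccf, bpf = odbtc_features(img)
    print(ccf.shape, bpf.shape)  # (64,) (16,)
```

In a retrieval setting, such per-image feature vectors would subsequently be mapped to compact binary codes (the cross-indexing step described above) so that similar images can be matched by fast code comparison.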