Background

Data mining is an active area of research concerned with mining or retrieving data and information from large databases or libraries. Image retrieval is the branch of data mining in which visual information (images) is retrieved from a large database or library. Earlier, text-based retrieval was used: images were annotated with text, and text-based database management systems were then used to perform image retrieval. Many advances, such as data modelling, multidimensional indexing, and query evaluation, have been made along this research direction. However, two major difficulties remain, especially when the size of image collections is very large (tens or hundreds of thousands). One is the vast amount of labor required to annotate the images manually. The other difficulty, which is more essential, arises from the rich content of the images and the subjectivity of human perception; that is, different users may perceive the same image content differently.

Abstract

This paper proposes a novel feature descriptor, named local quantized extrema patterns (LQEP), for content-based image retrieval. The standard local quantized patterns (LQP) collect the directional relationship between the center pixel and its surrounding neighbors, and the directional local extrema patterns (DLEP) collect directional information based on local extrema in the 0°, 45°, 90°, and 135° directions for a given center pixel in an image. In this paper, the concepts of LQP and DLEP are integrated to propose the LQEP for image retrieval applications. First, the directional quantized information is collected from the given image. Then, the directional extrema are collected from the quantized information. Finally, the RGB color histogram is integrated with the LQEP to generate the feature vector. The performance of the proposed method is tested in three experiments on the Corel-1K, Corel-5K, and MIT VisTex databases for natural and texture image retrieval, and is evaluated in terms of precision, recall, average retrieval precision, and average retrieval rate. The results show considerable improvements in these evaluation measures compared with existing methods on the respective databases.
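The pipeline described in the abstract (directional extrema in four directions, a quantized pattern code, and concatenation with an RGB color histogram) can be illustrated with a simplified sketch. The code below is not the exact LQEP operator defined in the paper; the extremum test, bin counts, and function names are assumptions chosen only to show the general structure of such a descriptor.

```python
# Hedged sketch of a directional-extrema descriptor in the spirit of DLEP/LQEP,
# concatenated with an RGB histogram. Names and parameters are illustrative.
import numpy as np

def directional_extrema_hist(gray):
    """4-bit code per pixel: bit d is set if the pixel is a local extremum
    (maximum or minimum) along direction d in {0, 45, 90, 135} degrees."""
    g = gray.astype(np.int32)
    c = g[1:-1, 1:-1]                        # center pixels (border excluded)
    offsets = [((0, 1), (0, -1)),            # 0 degrees
               ((-1, 1), (1, -1)),           # 45 degrees
               ((-1, 0), (1, 0)),            # 90 degrees
               ((-1, -1), (1, 1))]           # 135 degrees
    code = np.zeros_like(c)
    for bit, ((dy1, dx1), (dy2, dx2)) in enumerate(offsets):
        n1 = g[1 + dy1:g.shape[0] - 1 + dy1, 1 + dx1:g.shape[1] - 1 + dx1]
        n2 = g[1 + dy2:g.shape[0] - 1 + dy2, 1 + dx2:g.shape[1] - 1 + dx2]
        extremum = ((c > n1) & (c > n2)) | ((c < n1) & (c < n2))
        code |= extremum.astype(np.int32) << bit
    hist, _ = np.histogram(code, bins=16, range=(0, 16))
    return hist / max(code.size, 1)

def rgb_hist(rgb, bins_per_channel=16):
    """Concatenated, normalized per-channel color histogram."""
    feats = []
    for ch in range(3):
        h, _ = np.histogram(rgb[..., ch], bins=bins_per_channel, range=(0, 256))
        feats.append(h / rgb[..., ch].size)
    return np.concatenate(feats)

def feature_vector(rgb):
    gray = rgb.mean(axis=2)                  # simple luminance proxy
    return np.concatenate([directional_extrema_hist(gray), rgb_hist(rgb)])
```

The final vector follows the abstract's recipe at a coarse level: a texture histogram from directional extrema joined with a color histogram, so that both cues contribute to retrieval.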
In this paper, we propose a new feature descriptor, named local mesh quantized extrema patterns (LMeQEP), for image indexing and retrieval. The standard local quantized patterns collect the spatial relationship, in the form of larger or deeper texture patterns, based on the relative variations in the gray values of the center pixel and its neighbors. Directional local extrema patterns explore the directional information in the 0°, 45°, 90°, and 135° directions for the pixel positioned at the center. A mesh structure is created from the quantized extrema to derive significant textural information. Initially, the directional quantized data from the mesh structure are extracted to form the LMeQEP of the given image. Then, an RGB color histogram is built and integrated with the LMeQEP to enhance the performance of the system. To test the impact of the proposed method, experiments are conducted on benchmark image repositories such as MIT VisTex and Corel-1K. Average retrieval rate and average retrieval precision are used as the evaluation metrics to record the performance level. The experimental results show a considerable improvement over other recent image retrieval techniques.
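Both abstracts report precision, recall, average retrieval precision, and average retrieval rate on benchmark databases. A minimal sketch of that standard CBIR evaluation protocol is given below, assuming L1 distance matching and illustrative top-n cut-offs; neither the distance measure nor the cut-offs are specified in the abstracts.

```python
# Hedged sketch of the evaluation metrics named in the abstracts.
# Distance measure, cut-offs, and helper names are assumptions.
import numpy as np

def retrieve(query_feat, db_feats, top_n):
    """Indices of the top_n database images closest to the query (L1 distance,
    a common CBIR choice; the papers may use a different measure)."""
    d = np.abs(db_feats - query_feat).sum(axis=1)
    return np.argsort(d)[:top_n]

def precision_recall(query_idx, db_feats, labels, top_n):
    """Precision = relevant retrieved / retrieved; recall = relevant retrieved /
    total relevant in the database (images sharing the query's category)."""
    hits = retrieve(db_feats[query_idx], db_feats, top_n)
    relevant = labels[hits] == labels[query_idx]
    n_relevant_total = (labels == labels[query_idx]).sum()
    return relevant.sum() / top_n, relevant.sum() / n_relevant_total

def arp_arr(db_feats, labels, top_n=10):
    """Average retrieval precision/rate, averaging over every image as query."""
    ps, rs = zip(*(precision_recall(i, db_feats, labels, top_n)
                   for i in range(len(labels))))
    return np.mean(ps), np.mean(rs)
```

In this protocol every database image is used in turn as a query, and the per-query precision and recall are averaged to obtain the average retrieval precision and average retrieval rate reported in the experiments.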