We propose a method to identify and localize object classes in images. Instead of operating at the pixel level, we advocate the use of superpixels as the basic unit of a class segmentation or pixel localization scheme. To this end, we construct a classifier on the histogram of local features found in each superpixel. We regularize this classifier by aggregating histograms in the neighborhood of each superpixel and then refine our results further by using the classifier in a conditional random field operating on the superpixel graph. Our proposed method exceeds the previously published state-of-the-art on two challenging datasets: Graz-02 and the PASCAL VOC 2007 Segmentation Challenge.
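The regularization step described above, aggregating each superpixel's local-feature histogram with those of its neighbors on the superpixel graph, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function name, the adjacency representation, and the L1 normalization are assumptions for the example.

```python
import numpy as np

def aggregate_histograms(hists, adjacency):
    """Regularize per-superpixel histograms by summing each histogram
    with those of its graph neighbors, then L1-normalizing.

    hists: (n_superpixels, n_words) array of visual-word counts
    adjacency: dict mapping superpixel index -> iterable of neighbor indices
    """
    out = hists.astype(float).copy()
    for i, neighbors in adjacency.items():
        for j in neighbors:
            out[i] += hists[j]
    # Normalize so the classifier sees distributions rather than raw counts.
    return out / out.sum(axis=1, keepdims=True)
```

Aggregating over neighborhoods smooths the histograms, so superpixels with few interior features still receive informative statistics from their surroundings.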
We present an approach to determine the category and location of objects in images. It performs very fast categorization of each pixel in an image, a brute-force approach made feasible by three key developments: First, our method reduces the size of a large generic dictionary (on the order of ten thousand words) to the low hundreds while increasing classification performance compared to k-means. This is achieved by creating a discriminative dictionary tailored to the task by following the information bottleneck principle. Second, we perform feature-based categorization efficiently on a dense grid by extending the concept of integral images to the computation of local histograms. Third, we compute SIFT descriptors densely in linear time. We compare our method to the state of the art and find that it excels in accuracy and simplicity, performing better while assuming less.
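The second development, extending integral images to local histograms, is often called an integral histogram: after one cumulative-sum pass, the histogram of any axis-aligned window costs only four array lookups per bin. The sketch below is an illustration of that general idea, not the paper's code; the function names and the one-hot construction are assumptions for the example.

```python
import numpy as np

def integral_histogram(labels, n_bins):
    """Cumulative per-bin counts: ih[y, x, b] is the number of pixels
    with quantized word b in the rectangle rows 0..y, cols 0..x."""
    h, w = labels.shape
    onehot = np.zeros((h, w, n_bins))
    onehot[np.arange(h)[:, None], np.arange(w)[None, :], labels] = 1.0
    return onehot.cumsum(axis=0).cumsum(axis=1)

def local_histogram(ih, y0, x0, y1, x1):
    """Histogram of the window [y0:y1, x0:x1) in O(n_bins) time."""
    total = ih[y1 - 1, x1 - 1].copy()
    if y0 > 0:
        total -= ih[y0 - 1, x1 - 1]
    if x0 > 0:
        total -= ih[y1 - 1, x0 - 1]
    if y0 > 0 and x0 > 0:
        total += ih[y0 - 1, x0 - 1]
    return total
```

Because every window query is constant-time per bin, sliding a histogram window densely over the image costs the same per pixel regardless of window size, which is what makes dense per-pixel categorization tractable.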
VLFeat is an open and portable library of computer vision algorithms. It aims to facilitate fast prototyping and reproducible research for computer vision scientists and students. It includes rigorous implementations of common building blocks such as feature detectors, feature extractors, (hierarchical) k-means clustering, randomized kd-tree matching, and super-pixelization. The source code and interfaces are fully documented. The library integrates directly with MATLAB, a popular language for computer vision research.
Inside the mammalian nose lies a labyrinth of bony plates covered in epithelium collectively known as turbinates. Respiratory turbinates lie anteriorly and aid in heat and water conservation, while more posterior olfactory turbinates function in olfaction. Previous observations on a few carnivorans revealed that aquatic species have relatively large, complex respiratory turbinates and greatly reduced olfactory turbinates compared with terrestrial species. Body heat is lost more quickly in water than air, and increased respiratory surface area likely evolved to minimize heat loss. At the same time, olfactory surface area probably diminished due to a decreased reliance on olfaction when foraging under water. To explore how widespread these adaptations are, we documented scaling of respiratory and olfactory turbinate surface area with body size in a variety of terrestrial, freshwater, and marine carnivorans, including pinnipeds, mustelids, ursids, and procyonids. Surface areas were estimated from high-resolution CT scans of dry skulls, a novel approach that enabled a greater sampling of taxa than is practical with fresh heads. Total turbinate, respiratory, and olfactory surface areas correlate well with body size (r² ≥ 0.7), and are relatively smaller in larger species. Relative to body mass or skull length, aquatic species have significantly less olfactory surface area than terrestrial species. Furthermore, the ratio of olfactory to respiratory surface area is associated with habitat. Using phylogenetic comparative methods, we found strong support for convergence on 1:3 proportions in aquatic taxa and near the inverse in terrestrial taxa, indicating that aquatic mustelids and pinnipeds independently acquired similar proportions of olfactory to respiratory turbinates. Constraints on turbinate surface area in the nasal chamber may result in a trade-off between respiratory and olfactory function in aquatic mammals.