The human color sensation depends on the local and global spatial arrangement of the colors in the scene. Emulating this dependence requires exploring the image in search of a white reference. The Termite Retinex algorithm explores the image through a set of paths resembling the traces of a swarm of termites. Starting from this approach, we develop a novel spatial exploration scheme in which the termite paths are local minima of an energy function that depends on the image's visual content. The energy is designed to favor the visitation of regions containing information relevant to the color sensation while minimizing the coverage of less essential regions. This exploration method contributes to the investigation of the spatial properties of the color sensation and, to the best of our knowledge, is the first model relying on global mathematical conditions for the Retinex paths. The experiments show that the estimate of the color sensation obtained through the proposed spatial sampling is a valid alternative to the one based on Termite Retinex.
Inspired by the behavior of the human visual system, spatial color algorithms perform image enhancement by correcting the pixel channel lightness based on the spatial distribution of the intensities in the surrounding area. The two visual contrast enhancement algorithms RSR and STRESS belong to this family of models: they rescale the input based on local reference values, which are determined by exploring the image by means of random point samples, called sprays. Due to the use of sampling, they may yield a noisy output. In this paper, we introduce a probabilistic formulation of the two models: our algorithms (RSR-P and STRESS-P) rely implicitly on the whole population of possible sprays. For processing larger images, we also provide two approximate algorithms that exploit a suitable target-dependent space quantization. These population-based formulations outperform RSR and STRESS in terms of the processing time required to produce noiseless outputs. We argue that this population-based approach, which can be extended to other members of the family, complements the sampling-based approach, in that it offers not only better control in the design of approximate algorithms, but also additional insight into individual models and their relationships. We illustrate the latter point by providing a model of halo artifact formation.
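To make the spray idea concrete, the following is a minimal toy sketch of spray-based rescaling in the spirit of RSR, not the authors' implementation: each target pixel is divided by the average, over several random sprays, of the brightest intensity each spray contains. All function and parameter names here (`rsr_spray_rescale`, `n_sprays`, `n_points`, `radius`) are illustrative assumptions.

```python
import numpy as np

def rsr_spray_rescale(img, n_sprays=10, n_points=30, radius=8, rng=None):
    """Toy sketch of spray-based local rescaling (RSR-style).

    For each pixel (the "target"), draw several sprays of random points
    around it; each spray contributes the maximum intensity it contains
    (its local white reference), and the target is rescaled by the mean
    of these local maxima. Illustrative only, not the published RSR.
    """
    rng = np.random.default_rng(rng)
    h, w = img.shape
    out = np.empty((h, w), dtype=float)
    for y in range(h):
        for x in range(w):
            refs = []
            for _ in range(n_sprays):
                # Random spray offsets around the target, clipped to the image.
                dy = rng.integers(-radius, radius + 1, n_points)
                dx = rng.integers(-radius, radius + 1, n_points)
                ys = np.clip(y + dy, 0, h - 1)
                xs = np.clip(x + dx, 0, w - 1)
                # The spray's white reference is the brightest sampled pixel;
                # the target itself always participates.
                refs.append(max(img[ys, xs].max(), img[y, x]))
            out[y, x] = img[y, x] / np.mean(refs)
    return out
```

Because each spray is a small random sample, different runs give slightly different references for the same target, which is exactly the sampling noise that the population-based RSR-P/STRESS-P formulations described above remove.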
In this paper we present an ultra-low-power smart visual sensor architecture. A 10.6µW low-resolution contrast-based imager featuring internal analog pre-processing is coupled with an energy-efficient quad-core cluster processor that exploits near-threshold computing within a few-mW power envelope. We demonstrate the capability of the smart camera in a moving-object detection framework. The computational load is distributed between mixed-signal pixel processing and digital parallel processing. Such local processing reduces the amount of digital data to be sent out of the node by 91%. Exploiting context-aware analog circuits, the imager dispatches only meaningful post-processed data to the processing unit, lowering the sensor-to-processor bandwidth by 31x with respect to transmitting a full pixel frame. To extract high-level features, an event-driven approach is applied to the sensor data and optimized for parallel runtime execution. On a low-power MCU node, the event-driven approach achieves a 57.7x system energy saving with respect to frame-based processing. The near-threshold parallel processor further reduces the processing energy cost by 6.64x, achieving an overall system energy cost of 1.79µJ per frame, which is 21.8x lower than an event-based imaging system built on an asynchronous visual sensor and up to 383x lower than a traditional frame-based smart visual sensor.
Modeling the local spatial color distribution is a crucial step for the algorithms of the Milano Retinex family. Here we present GREAT, a novel, noise-free Milano Retinex implementation based on an image-aware spatial color sampling. For each channel of a color input image, GREAT computes a 2D set of edges whose magnitude exceeds a pre-defined threshold. Then GREAT re-scales the channel intensity of each image pixel, called the target, by the average of the intensities of the selected edges, weighted by a function of their positions, gradient magnitudes, and intensities relative to the target. In this way, GREAT enhances the input image, adjusting its brightness, contrast, and dynamic range. The use of edges as the pixels relevant to color filtering is justified by the important role edges play in human color sensation. The name GREAT comes from the expression "Gradient RElevAnce for ReTinex," which refers to the threshold-based definition of a gradient relevance map for edge selection and thus for image color filtering.
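The two steps described above can be sketched as follows for a single channel. This is an illustrative sketch under assumed choices, not the published GREAT definition: the relevance map is a simple gradient-magnitude threshold, and the weight of each selected edge is taken as its gradient magnitude divided by its distance from the target; the actual weighting function in GREAT also involves the edge intensities relative to the target.

```python
import numpy as np

def great_like_rescale(channel, grad_thresh=0.05, eps=1e-6):
    """Illustrative edge-weighted rescaling in the spirit of GREAT.

    1. Build a gradient relevance map: keep only pixels whose gradient
       magnitude exceeds grad_thresh (the "selected edges").
    2. Rescale each target pixel by a weighted average of the selected
       edge intensities; nearer and stronger edges count more.
    Names, weights and the threshold are assumptions for illustration.
    """
    h, w = channel.shape
    gy, gx = np.gradient(channel.astype(float))
    gmag = np.hypot(gx, gy)
    ey, ex = np.nonzero(gmag > grad_thresh)      # selected edge pixels
    if ey.size == 0:
        # No edges found (e.g. a flat image): fall back to a global reference.
        return channel / max(channel.max(), eps)
    evals = channel[ey, ex].astype(float)        # edge intensities
    ew = gmag[ey, ex]                            # edge gradient magnitudes
    out = np.empty((h, w), dtype=float)
    for y in range(h):
        for x in range(w):
            d = np.hypot(ey - y, ex - x) + 1.0   # +1 avoids division by zero
            wgt = ew / d                         # stronger and nearer edges weigh more
            ref = np.sum(wgt * evals) / np.sum(wgt)  # weighted local reference
            out[y, x] = channel[y, x] / max(ref, eps)
    return out
```

Unlike the spray-based methods, this computation is deterministic: the same input always selects the same edges and produces the same output, which is why an edge-driven sampling of this kind is noise-free.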
The human color sensation depends on the spatial distribution of the colors in the viewed scene. This principle is at the basis of the random spray Retinex (RSR) algorithm. In this work, we modify RSR by integrating its approach with a method to weight and tune the locality of spatial image information. This modification allows for spatial control of the local effect of RSR on image color filtering. We study the performance of this spatially weighted version of RSR on a public image dataset by analyzing and comparing several features of the output images and their local properties.