Color vision deficiency (CVD) is caused by anomalies in the cone cells of the human retina and affects approximately 200 million individuals worldwide. Although previous studies have proposed compensation methods, contrast enhancement and naturalness preservation have not been adequately addressed together in state-of-the-art work. This paper focuses on compensation for red-green dichromats and proposes a recoloring algorithm that combines contrast enhancement and naturalness preservation in a unified optimization model. In this implementation, representative color extraction and edit propagation methods are introduced to maintain global and local information in the recolored image. The quantitative evaluation results showed that the proposed method is competitive with state-of-the-art methods. A subjective experiment was also conducted, and its results revealed that the proposed method obtained the best scores for preserving both naturalness and information for individuals with severe red-green CVD.
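The abstract does not give implementation details, but a minimal sketch of the two auxiliary steps it names, representative color extraction and edit propagation, might look like the following. The use of k-means for the palette and of similarity-weighted propagation are assumptions for illustration, as are the palette size, the bandwidth sigma, and all function names; the recolored palette itself would come from the paper's unified optimization of contrast and naturalness.

```python
import numpy as np
from sklearn.cluster import KMeans

def extract_representative_colors(image_rgb, n_colors=8):
    """Cluster pixels to obtain a small palette of representative colors.

    image_rgb: H x W x 3 float array in [0, 1].
    Returns (palette, labels), where palette has shape (n_colors, 3).
    """
    pixels = image_rgb.reshape(-1, 3)
    km = KMeans(n_clusters=n_colors, n_init=10, random_state=0).fit(pixels)
    return km.cluster_centers_, km.labels_

def propagate_edits(image_rgb, palette, recolored_palette, sigma=0.1):
    """Spread a palette recoloring to every pixel.

    Each pixel receives a blend of the palette edits, weighted by how similar
    the pixel is to each original representative color, so local detail is
    kept while the global palette change is applied.
    """
    pixels = image_rgb.reshape(-1, 3)
    # Gaussian affinity between every pixel and every representative color.
    d2 = ((pixels[:, None, :] - palette[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2 * sigma ** 2))
    w /= w.sum(axis=1, keepdims=True)
    # Apply the per-palette color offsets, blended by the affinity weights.
    offsets = recolored_palette - palette          # (n_colors, 3)
    recolored = pixels + w @ offsets
    return np.clip(recolored, 0.0, 1.0).reshape(image_rgb.shape)
```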
Human anatomical specimen museums are commonly used by medical, nursing, and paramedical students. Through dissection and prosection, the specimens housed in these museums allow students to appreciate the complex relationships of organs and structures in more detail than textbooks can provide. However, it may be difficult for students, particularly novices, to identify the various parts of these anatomical structures without additional explanations from a docent or supplemental illustrations. Recently, augmented reality (AR) has been used in many museum exhibits to display virtual objects in videos captured from the real world, and this technology can significantly enhance the learning experience. In this study, three AR-based support systems for tours in medical specimen museums were developed, and their usability and effectiveness for learning were examined. The first system was constructed using AR markers: it displayed virtual label information for specimens when the markers were captured by a tablet camera. Individual AR markers were required for all specimens, however, and their presence in and on the prosected specimens could be obtrusive. The second system therefore used the specimen image itself as an image marker, as most specimens were displayed in cross section; visitors could then obtain the AR label information without any markers intruding on the display or the anatomical specimens. The third system consisted of a head-mounted display combined with a natural click interface, providing visitors with an environment for natural manipulation of virtual objects and allowing for future scalability.
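The abstract does not name the AR toolkit used. As a rough illustration of the first, marker-based system only, the sketch below detects a fiducial marker in a camera frame and overlays a specimen label near it, using OpenCV's ArUco module (4.7+ API); the marker dictionary, the id-to-label mapping, and the function name are assumptions, not details from the study.

```python
import cv2

# Hypothetical mapping from marker id to the specimen label it represents.
SPECIMEN_LABELS = {0: "Left ventricle", 1: "Mitral valve"}

def annotate_frame(frame_bgr):
    """Detect ArUco markers in a camera frame and draw specimen labels."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None:
        return frame_bgr
    for marker_corners, marker_id in zip(corners, ids.flatten()):
        label = SPECIMEN_LABELS.get(int(marker_id))
        if label is None:
            continue
        # Anchor the label just above the marker's first detected corner.
        x, y = marker_corners[0][0]
        cv2.putText(frame_bgr, label, (int(x), int(y) - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    return frame_bgr
```

The second, markerless system would replace the fiducial detection with image-target recognition of the specimen itself, and the third would move the same overlay logic to a head-mounted display.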
Several image recoloring methods have been proposed to compensate for the loss of contrast caused by color vision deficiency (CVD). However, these methods only work for dichromacy (in which one of the three types of cone cells loses its function completely), while the majority of CVD cases are anomalous trichromacy (in which one of the three types of cone cells only partially loses its function). In this paper, a novel degree-adaptable recoloring algorithm is presented, which recolors images by minimizing an objective function constrained by contrast enhancement and naturalness preservation. To assess the effectiveness of the proposed method, a quantitative evaluation using common metrics and subjective studies involving 14 volunteers with varying degrees of CVD were conducted. The results of the evaluation experiment show that the proposed personalized recoloring method outperforms state-of-the-art methods, achieving desirable contrast enhancement adapted to different degrees of CVD while preserving naturalness as much as possible.
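The paper's exact objective function and severity model are not given in the abstract. The sketch below shows one plausible degree-adaptable form under stated assumptions: the dichromat simulation is linearly interpolated toward identity by a severity parameter, and a small palette is re-optimized so that pairwise contrast, as seen by the simulated observer, matches the original while the colors stay close to the originals. The protanopia matrix is the commonly used Machado et al.-style approximation, and the weight lambda and all names are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

# Approximate linear protanopia simulation in linear RGB (Machado et al.-style
# values; treat as illustrative, not as the paper's exact model).
PROTAN_FULL = np.array([[ 0.152286, 1.052583, -0.204868],
                        [ 0.114503, 0.786281,  0.099216],
                        [-0.003882, -0.048116,  1.051998]])

def simulate(colors, severity):
    """Blend between normal vision (identity) and full protanopia."""
    m = (1.0 - severity) * np.eye(3) + severity * PROTAN_FULL
    return colors @ m.T

def recolor_palette(palette, severity, lam=0.5):
    """Re-optimize a small palette of colors in [0, 1] for a given severity.

    Contrast term: pairwise distances of the recolored palette, as seen by the
    simulated observer, should match the original pairwise distances.
    Naturalness term: recolored colors should stay close to the originals.
    """
    n = len(palette)
    target = np.linalg.norm(palette[:, None] - palette[None, :], axis=2)

    def objective(x):
        cand = x.reshape(n, 3)
        seen = simulate(cand, severity)
        dist = np.linalg.norm(seen[:, None] - seen[None, :], axis=2)
        contrast = ((dist - target) ** 2).sum()
        naturalness = ((cand - palette) ** 2).sum()
        return contrast + lam * naturalness

    res = minimize(objective, palette.ravel(), method="L-BFGS-B",
                   bounds=[(0.0, 1.0)] * (3 * n))
    return res.x.reshape(n, 3)
```

In this sketch a severity near 0 leaves the palette essentially unchanged, while a severity of 1 reduces to the dichromat case, which mirrors the degree-adaptable behavior the abstract describes.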