Estimated vital signs may comprise a variety of measurements that can be used to detect abnormal conditions by analyzing facial images captured through continuous monitoring with a thermal video camera. To overcome the limits of human visual perception, thermal infrared imaging has proven to be an effective technique for visualizing facial colour changes that reflect variations in oxygenation levels and blood volume in the facial arteries. This study investigated the possibility of estimating vital signs from physiological function images converted from thermal infrared images, in the same way that visible-light images are used; this requires an efficient feature-extraction method with correction procedures, evaluated on datasets that include images with and without glasses or protective face masks. This paper summarizes work on thermal images using advanced machine learning and deep learning methods with satisfactory performance. We also present the evaluation metrics used in the assessment, based on statistical analysis, accuracy measures, and error measures. Finally, we discuss remaining gaps and directions for further evaluation.
INDEX TERMS Thermal images, feature extraction, vital signs estimation, evaluation metrics.
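The abstract above groups its assessment into statistical analysis, accuracy measures, and error measures. As a minimal sketch of the error and statistical measures such reviews commonly report for vital-sign estimation (the function name and metric selection here are illustrative, not taken from the paper), mean absolute error, root-mean-square error, and Pearson correlation between estimated and reference readings can be computed as:

```python
import math

def error_metrics(estimated, reference):
    """Common error measures between estimated and reference vital signs.

    `estimated` and `reference` are equal-length sequences, e.g. heart
    rates in beats per minute from a thermal camera vs. a contact sensor.
    """
    n = len(reference)
    errors = [e - r for e, r in zip(estimated, reference)]
    mae = sum(abs(d) for d in errors) / n                 # mean absolute error
    rmse = math.sqrt(sum(d * d for d in errors) / n)      # root-mean-square error
    # Pearson correlation between the two series
    me, mr = sum(estimated) / n, sum(reference) / n
    cov = sum((e - me) * (r - mr) for e, r in zip(estimated, reference))
    var_e = sum((e - me) ** 2 for e in estimated)
    var_r = sum((r - mr) ** 2 for r in reference)
    pearson = cov / math.sqrt(var_e * var_r)
    return mae, rmse, pearson
```

For example, `error_metrics([72, 80, 65], [70, 78, 66])` yields an MAE of about 1.67 bpm and an RMSE of about 1.73 bpm, with the correlation indicating how well the estimates track the reference trend.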
Infrared thermography (IRT) offers potential opportunities as a tool for disease detection in livestock. Despite considerable research in this area, there are no common standards or protocols for managing IRT parameters in animal disease detection research. In this review, we investigate parameters that are essential to the progression of this tool and make recommendations for their use based on the literature found and the veterinary thermography guidelines from the American Academy of Thermology. We analyzed a defined set of 109 articles concerned with the use of IRT for disease-related applications in livestock; from these articles, parameters for accurate IRT were identified and sorted into camera-, animal- or environment-related categories to assess each article's practice in reporting parameters. This review demonstrates the inconsistencies in practice across peer-reviewed articles and reveals that some important parameters are completely unreported while others are incorrectly captured and/or under-represented in the literature. Further to this, our review highlights the lack of measured emissivity values for live animals in multiple species. We present guidelines for the standards of parameters that should be used and reported in future experiments and discuss potential opportunities and challenges associated with using IRT for disease detection in livestock.
This paper focuses on improving automated approaches to glaucoma diagnosis, a severe disease that leads to gradually narrowing vision and potentially blindness due to optic nerve damage occurring without the patient’s awareness. Early diagnosis is crucial. By utilizing advanced deep learning technologies and robust image processing capabilities, this study employed four types of input data (retina fundus image, region of interest (ROI), vascular region of interest (VROI), and color palette images) to reflect structural issues. We addressed the issue of data imbalance with a modified loss function and proposed an ensemble model based on the vision large language model (VLLM), which improved the accuracy of glaucoma classification. The results showed that the models developed for each dataset achieved 1% to 10% higher accuracy and 8% to 29% improved sensitivity compared to conventional single-image analysis. On the REFUGE dataset, we achieved a high accuracy of 0.9875 and a sensitivity of 0.9. Particularly in the ORIGA dataset, which is challenging in terms of achieving high accuracy, we confirmed a significant increase, with an 11% improvement in accuracy and a 29% increase in sensitivity. This research can significantly contribute to the early detection and management of glaucoma, indicating potential clinical applications. These advancements will not only further the development of glaucoma diagnostic technologies but also play a vital role in improving patients’ quality of life.
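The glaucoma abstract above reports results as accuracy and sensitivity. As a minimal sketch of how these two metrics follow from a binary confusion matrix (glaucoma vs. normal; the function name and label encoding are illustrative and not from the paper):

```python
def accuracy_and_sensitivity(y_true, y_pred, positive=1):
    """Accuracy and sensitivity (recall on the positive class) for binary labels.

    `positive` marks the glaucoma class; 0/1 labels here are illustrative.
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    # Accuracy counts all correct predictions; sensitivity counts only
    # how many true glaucoma cases were caught.
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    return accuracy, sensitivity
```

Sensitivity is the clinically critical number here: a model can achieve high accuracy on an imbalanced dataset while still missing many glaucoma cases, which is why the abstract reports both figures.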