Before a person suffering from a traumatic brain injury (TBI) reaches a medical facility, the pupillary light reflex (PLR) is one of the few quantitative measures a clinician can use to predict their outcome. We propose PupilScreen, a smartphone app and accompanying 3D-printed box that combines the repeatability, accuracy, and precision of a clinical device with the ubiquity and convenience of the penlight test that clinicians regularly use in emergency situations. The PupilScreen app stimulates the patient's eyes using the smartphone's flash and records the response using the camera. The PupilScreen box, akin to a head-mounted virtual reality display, controls the eyes' exposure to light. The recorded video is processed using convolutional neural networks that track the pupil diameter over time, allowing for the derivation of clinically relevant measures. We tested two different network architectures and found that a fully convolutional neural network tracked pupil diameter with a median error of 0.30 mm. We also conducted a pilot clinical evaluation with six patients who had suffered a TBI and found that clinicians could distinguish unhealthy pupillary light reflexes from healthy ones with near-perfect accuracy using PupilScreen's measurements alone.
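The abstract does not give the network details, but a minimal sketch can illustrate the fully convolutional approach: the network labels every pixel of an eye image as pupil or not, and the diameter is recovered from the area of the predicted mask. The PupilFCN name, all layer sizes, the threshold, and the mm-per-pixel conversion below are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only (not the authors' code): a small fully convolutional
# network that labels each pixel of an eye image as pupil / not-pupil, plus a
# helper that turns the predicted mask into a diameter estimate.
import math
import torch
import torch.nn as nn

class PupilFCN(nn.Module):
    """Tiny encoder-decoder FCN; layer sizes are arbitrary placeholders."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.Conv2d(16, 1, 1),  # one pupil logit per pixel
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def diameter_mm(logits, mm_per_pixel=0.05, threshold=0.5):
    """Assume a roughly circular pupil: d = 2 * sqrt(area / pi).
    The mm-per-pixel factor is a made-up calibration constant."""
    mask = torch.sigmoid(logits) > threshold
    area_px = mask.float().sum()
    return 2.0 * math.sqrt(area_px.item() / math.pi) * mm_per_pixel

# Example on one 128x128 video frame (random pixels as a stand-in).
frame = torch.rand(1, 3, 128, 128)
model = PupilFCN()
with torch.no_grad():
    print(f"estimated pupil diameter: {diameter_mm(model(frame)):.2f} mm")
```

Applying this per frame yields the pupil diameter as a function of time, from which measures such as constriction velocity and latency could then be derived.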
Background: Blood typing, donor compatibility testing, and hematocrit analysis are common tests that are important in many clinical applications, including high-stakes settings such as the trauma center. These tests are typically performed in centralized laboratories with sample batching; the minutes lost to this workflow can lead to adverse outcomes, especially for critical-care patients. As a step toward providing rapid results at the bedside, we developed a point-of-care hemagglutination system relying on digital microfluidics (DMF) and a unique, automated readout tool: droplet agglutination assessment using digital microfluidics (DAAD).
Methods: ABO and Rhesus blood grouping, donor crossmatching, and hematocrit assays were developed on a portable DMF platform that allowed for automated sample processing. The result of each assay could be determined by eye or automatically with the DAAD imaging tool.
Results: DMF-DAAD was applied to 109 samples collected from different sources (including commercial samples, pinpricks from volunteers, and a hospital blood bank), with perfect fidelity to gold-standard results. Some of these tests were carried out by a nonexpert in a hospital trauma center. Proof-of-concept results were also collected from smaller sample sets for donor compatibility testing and hematocrit analysis.
Conclusion: DMF-DAAD shows promise for delivering rapid, reliable results in a format well suited for a trauma center and other settings where every minute counts.
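The abstract does not describe how the DAAD imaging tool scores agglutination, so the following is only a hedged stand-in: a simple texture heuristic that assumes agglutinated droplets show clumpy, high-contrast pixel intensities while well-mixed suspensions remain uniform. The function name and threshold are invented for illustration and are not the DAAD algorithm.

```python
# Illustrative stand-in for an automated agglutination readout: score a
# grayscale droplet image by its intensity spread and compare to a threshold.
import numpy as np

def is_agglutinated(droplet_gray: np.ndarray, texture_threshold: float = 12.0) -> bool:
    """Classify a grayscale droplet image (0-255) by its pixel-intensity spread."""
    return float(droplet_gray.std()) > texture_threshold

# Example with synthetic images standing in for real droplet photos.
uniform = np.full((64, 64), 120, dtype=np.uint8)          # evenly mixed cells
clumpy = uniform.copy()
clumpy[::4, ::4] = 220                                     # bright clumps
print(is_agglutinated(uniform), is_agglutinated(clumpy))   # False True
```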
Pancreatic cancer has one of the worst survival rates among all forms of cancer because its symptoms typically manifest late in the progression of the disease. One of those symptoms is jaundice, the yellow discoloration of the skin and sclera caused by the buildup of bilirubin in the blood. Jaundice is only recognizable to the naked eye in severe stages, but a ubiquitous test using computer vision and machine learning could detect milder forms of jaundice. We propose BiliScreen, a smartphone app that captures pictures of the eye and produces an estimate of a person's bilirubin level, even at levels normally undetectable by the human eye. We test two low-cost accessories that reduce the effects of external lighting: (1) a 3D-printed box that controls the eyes' exposure to light and (2) paper glasses with colored squares for calibration. In a 70-person clinical study, we found that BiliScreen with the box accessory achieves a Pearson correlation coefficient of 0.89 and a mean error of -0.09 ± 2.76 mg/dl in predicting a person's bilirubin level. Used as a screening tool with the box accessory, BiliScreen identifies cases of concern with a sensitivity of 89.7% and a specificity of 96.8%.
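As a sanity check on how such screening numbers are computed, the sketch below derives sensitivity, specificity, Pearson correlation, and mean error from paired predicted and measured bilirubin values. The 1.3 mg/dl "case of concern" cutoff and the toy data are assumed examples, not values from the paper.

```python
# Sketch of computing the reported evaluation metrics from paired
# predicted vs. ground-truth bilirubin levels (mg/dl).
import numpy as np

def screening_metrics(predicted, measured, cutoff_mg_dl=1.3):
    predicted, measured = np.asarray(predicted, float), np.asarray(measured, float)
    errors = predicted - measured
    pearson_r = np.corrcoef(predicted, measured)[0, 1]
    pred_pos, true_pos = predicted >= cutoff_mg_dl, measured >= cutoff_mg_dl
    sensitivity = np.mean(pred_pos[true_pos])    # flagged among true cases
    specificity = np.mean(~pred_pos[~true_pos])  # cleared among non-cases
    return {
        "pearson_r": pearson_r,
        "mean_error": errors.mean(),
        "error_sd": errors.std(ddof=1),
        "sensitivity": sensitivity,
        "specificity": specificity,
    }

# Toy example with made-up values (mg/dl).
print(screening_metrics([0.7, 1.6, 3.2, 0.9], [0.8, 1.4, 2.9, 1.1]))
```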