Background: Head-mounted displays (HMDs) and virtual reality (VR) have been widely used in recent years, and a user's experience and computational efficiency can be assessed with mounted eye trackers. However, in addition to visually induced motion sickness (VIMS), eye fatigue has increasingly emerged during and after the viewing experience, highlighting the need for quantitative assessment of these detrimental effects. As no measurement method for the eye fatigue caused by HMDs has been widely accepted, we measured parameters related to optometry tests. We propose a novel computational approach to estimating eye fatigue by providing several verifiable models.
Results: We implemented three classification models and two regression models to investigate different feature sets, which yielded two valid assessment models for eye fatigue that employ blinking features and eye-movement features, with optometry-test indicators as the ground truth. The models provide three-grade results and a continuous result, respectively, making the overall assessment repeatable and comparable.
Conclusion: We showed the differences between VIMS and eye fatigue, and we presented a new scheme to assess eye fatigue in HMD users through analysis of eye-tracker parameters.
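As a rough illustration of the assessment scheme described above, the sketch below trains one classifier (graded fatigue) and one regressor (continuous fatigue score) on eye-tracking features. The feature names, model choices, and data shapes are assumptions for illustration only; the abstract does not specify the algorithms or features actually used.

```python
# Minimal sketch, assuming hypothetical blink/eye-movement features and
# random-forest models; the paper's actual pipeline may differ.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Hypothetical per-session feature vectors: blink rate, mean blink duration,
# saccade amplitude, fixation duration, etc., extracted from the eye tracker.
X = np.random.rand(120, 6)                 # 120 sessions x 6 eye-tracking features
y_grade = np.random.randint(0, 3, 120)     # graded fatigue label (0 = none, 1 = mild, 2 = severe)
y_score = np.random.rand(120)              # continuous score from optometry-test indicators

# Graded assessment: classifier trained against optometry-test ground truth.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("graded accuracy:", cross_val_score(clf, X, y_grade, cv=5).mean())

# Continuous assessment: regressor producing a repeatable, comparable score.
reg = RandomForestRegressor(n_estimators=200, random_state=0)
print("continuous R^2:", cross_val_score(reg, X, y_score, cv=5).mean())
```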
Coronavirus Disease 2019 (COVID-19), caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), has become a serious global pandemic in the past few months and has caused enormous losses to human society worldwide. For such a large-scale pandemic, early detection and isolation of potential virus carriers are essential to curbing its spread. Recent studies have shown that one important feature of COVID-19 is the abnormal respiratory status caused by viral infection. During the pandemic, many people wear masks to reduce the risk of getting sick. Therefore, in this paper, we propose a portable non-contact method to screen the health conditions of people wearing masks through analysis of respiratory characteristics captured by RGB-infrared sensors. We first develop a technique to capture respiratory data from people wearing masks by using face recognition. Then, a bidirectional GRU neural network with an attention mechanism is applied to the respiratory data to obtain the health screening result. The results of validation experiments show that our model can identify respiratory health status with 83.69% accuracy, 90.23% sensitivity and 76.31% specificity on a real-world dataset. This work demonstrates that the proposed RGB-infrared sensors on a portable device can be used as a pre-scan method for respiratory infections, which provides a theoretical basis to encourage controlled clinical trials and thus helps fight the current COVID-19 pandemic.
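The following is a minimal sketch of a bidirectional GRU with an attention mechanism over a 1-D respiratory waveform, as named in the abstract above. The sequence length, hidden size, pooling details, and class count are assumptions; the paper's exact architecture and preprocessing are not described here.

```python
# Minimal sketch, assuming a 1-D respiratory signal and additive attention
# over GRU states; the authors' exact model may differ.
import torch
import torch.nn as nn

class BiGRUAttention(nn.Module):
    """Bidirectional GRU with attention over time steps, classifying a
    respiratory waveform as normal vs. abnormal (hypothetical labels)."""
    def __init__(self, input_size=1, hidden_size=64, num_classes=2):
        super().__init__()
        self.gru = nn.GRU(input_size, hidden_size, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_size, 1)   # scores each time step
        self.fc = nn.Linear(2 * hidden_size, num_classes)

    def forward(self, x):                            # x: (batch, time, 1)
        h, _ = self.gru(x)                           # (batch, time, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1) # attention weights over time
        context = (weights * h).sum(dim=1)           # attention-weighted summary
        return self.fc(context)                      # logits for health screening

# Usage on a dummy batch of 8 respiratory signals, 300 samples each.
model = BiGRUAttention()
logits = model(torch.randn(8, 300, 1))
print(logits.shape)                                  # torch.Size([8, 2])
```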