Recent anecdotal and scientific reports have provided evidence of a link between COVID-19 and chemosensory impairments such as anosmia. However, these reports have downplayed or failed to distinguish potential effects on taste, ignored chemesthesis, and generally lacked quantitative measurements. Here, we report the development, implementation, and initial results of a multi-lingual, international questionnaire to assess self-reported quantity and quality of perception in three distinct chemosensory modalities (smell, taste, and chemesthesis) before and during COVID-19. In the first 11 days after questionnaire launch, 4039 participants (2913 women, 1118 men, 8 other; ages 19-79) reported a COVID-19 diagnosis either via laboratory tests or clinical assessment. Importantly, smell, taste, and chemesthetic function were each significantly reduced compared to their status before the disease. Difference scores (maximum possible change ±100) revealed a mean reduction of smell (-79.7 ± 28.7, mean ± SD), taste (-69.0 ± 32.6), and chemesthetic (-37.3 ± 36.2) function during COVID-19. Qualitative changes in olfactory ability (parosmia and phantosmia) were relatively rare and correlated with smell loss. Notably, perceived nasal obstruction did not account for smell loss. Furthermore, chemosensory impairments were similar between participants in the laboratory test and clinical assessment groups. These results show that COVID-19-associated chemosensory impairment is not limited to smell, but also affects taste and chemesthesis. The multimodal impact of COVID-19 and lack of perceived nasal obstruction suggest that SARS-CoV-2 infection may disrupt sensory-neural mechanisms.
In a preregistered, cross-sectional study, we investigated whether olfactory loss is a reliable predictor of COVID-19 using a crowdsourced questionnaire in 23 languages to assess symptoms in individuals self-reporting recent respiratory illness. We quantified changes in chemosensory abilities during the course of the respiratory illness using 0-100 visual analog scales (VAS) for participants reporting a positive (C19+; n=4148) or negative (C19-; n=546) COVID-19 laboratory test outcome. Logistic regression models identified univariate and multivariate predictors of COVID-19 status and post-COVID-19 olfactory recovery. Both C19+ and C19- groups exhibited smell loss, but it was significantly larger in C19+ participants (mean±SD, C19+: -82.5±27.2 points; C19-: -59.8±37.7). Smell loss during illness was the best predictor of COVID-19 in both univariate and multivariate models (ROC AUC=0.72), and additional variables provided negligible model improvement. VAS ratings of smell loss were more predictive than binary chemosensory yes/no questions or other cardinal symptoms (e.g., fever). Olfactory recovery within 40 days of respiratory symptom onset was reported for ~50% of participants and was best predicted by time since respiratory symptom onset. We find that quantified smell loss is the best predictor of COVID-19 amongst those with symptoms of respiratory illness. To aid clinicians and contact tracers in identifying individuals with a high likelihood of having COVID-19, we propose a novel 0-10 scale to screen for recent olfactory loss, the ODoR-19. We find that numeric ratings ≤2 indicate high odds of symptomatic COVID-19 (4<OR<10). Once independently validated, this tool could be deployed when viral lab tests are impractical or unavailable.
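The univariate analysis described above can be sketched in code. This is an illustration on synthetic data only, not the study's actual analysis pipeline: a logistic regression predicts COVID-19 test status from a 0-100 VAS smell-change score, and discrimination is summarized with ROC AUC. The sample sizes and score distributions below are fabricated to loosely mimic the reported group means.

```python
# Illustrative sketch (synthetic data): univariate logistic regression
# predicting COVID-19 status from a VAS smell-change score, with ROC AUC
# as the discrimination metric. All data here are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_pos, n_neg = 400, 100  # hypothetical group sizes

# Simulated smell-change scores (-100..0), loosely mimicking the
# reported means: C19+ ~ -82.5 (SD 27.2), C19- ~ -59.8 (SD 37.7).
smell_pos = np.clip(rng.normal(-82.5, 27.2, n_pos), -100, 0)
smell_neg = np.clip(rng.normal(-59.8, 37.7, n_neg), -100, 0)

X = np.concatenate([smell_pos, smell_neg]).reshape(-1, 1)
y = np.concatenate([np.ones(n_pos), np.zeros(n_neg)])

model = LogisticRegression().fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"in-sample ROC AUC: {auc:.2f}")
```

Because the simulated group distributions overlap substantially, the resulting AUC is modest, which is consistent with the idea that a single symptom can be a useful but imperfect screen.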
The exemplary search capabilities of flying insects have established them as one of the most diverse taxa on Earth. However, we still lack the fundamental ability to quantify, represent, and predict trajectories under natural contexts to understand search and its applications. For example, flying insects have evolved in complex multimodal three-dimensional (3D) environments, but we do not yet understand which features of the natural world are used to locate distant objects. Here, we independently and dynamically manipulate 3D objects, airflow fields, and odor plumes in virtual reality over large spatial and temporal scales. We demonstrate that flies make use of features such as foreground segmentation, perspective, motion parallax, and integration of multiple modalities to navigate to objects in a complex 3D landscape while in flight. We first show that tethered flying insects of multiple species navigate to virtual 3D objects. Using the apple fly Rhagoletis pomonella, we then measure their reactive distance to objects and show that these flies use perspective and local parallax cues to distinguish and navigate to virtual objects of different sizes and distances. We also show that apple flies can orient in the absence of optic flow by using only directional airflow cues, and require simultaneous odor and directional airflow input for plume following to a host volatile blend. The elucidation of these features unlocks the opportunity to quantify parameters underlying insect behavior such as reactive space, optimal foraging, and dispersal, as well as develop strategies for pest management, pollination, robotics, and search algorithms.