Remote testing of auditory function could transform both basic research and hearing healthcare; historically, however, many obstacles have limited remote collection of reliable and valid auditory psychometric data. Here, we report performance on a battery of auditory processing tests using a remotely administered system, Portable Automatic Rapid Testing. We compare a previously reported dataset collected in a laboratory setting with the same measures collected using uncalibrated, participant-owned devices in remote settings (experiment 1, n = 40), remotely with and without calibrated hardware (experiment 2, n = 36), and in the laboratory with and without calibrated hardware (experiment 3, n = 58). Results were well-matched across datasets and had similar reliability, but overall performance was slightly worse than published norms. Analyses of potential nuisance factors such as environmental noise, distraction, or lack of calibration failed to provide reliable evidence that these factors contributed to the observed variance in performance. These data indicate the feasibility of remote testing of suprathreshold auditory processing using participants' own devices. Although the current investigation was limited to young participants without hearing difficulties, its outcomes demonstrate the potential for large-scale, remote hearing testing of more hearing-diverse populations, both to advance basic science and to establish the clinical viability of remote auditory testing.
While remote data collection is not a new concept, the quality and psychometric properties of data collected remotely often remain unclear. Most remote data collection is done via online survey tools or web-conferencing applications (e.g., Skype or Zoom) and largely involves questionnaires, interviews, or other self-report data. Little research has examined remotely administered cognitive assessments and interventions that require multiple sessions, with or without the assistance of an experimenter. The present paper discusses the limitations and challenges of studies administered remotely, and outlines methods used to overcome such challenges while effectively collecting cognitive performance data remotely via Zoom. We further compare recruitment, retention rates, compliance, and performance between in-lab and remotely administered cognitive assessment and intervention studies, and discuss limitations of remote data collection. We found that while it was necessary to recruit more participants in remote studies to reach enrollment goals, compliance and performance were largely comparable between in-lab and remotely administered studies, illustrating the opportunity to conduct this type of experimental research remotely with adequate fidelity.
Measuring selective attention in a speeded task can provide valuable insight into the concentration ability of an individual, and can inform neuropsychological assessment of attention in aging, traumatic brain injury, and various psychiatric disorders. There are only a few tools to measure selective attention that are freely available, psychometrically validated, and usable flexibly for both in-person and remote assessment. To address this gap, we developed a self-administrable, mobile-based test called “UCancellation” (University of California Cancellation), which was designed to assess selective attention and concentration and has two stimulus sets: Letters and Pictures. UCancellation takes less than 7 minutes to complete, is automatically scored, has multiple forms to allow repeated testing, and is compatible with a variety of iOS and Android devices. Here we report the results of a study that examined parallel-test reliability and convergent validity of UCancellation in a sample of 104 college students. UCancellation Letters and Pictures showed adequate parallel-test reliability (r = .71–.83, p < 0.01) and internal consistency (α = .73–.91). It also showed convergent validity with another widely used cancellation task, the d2 Test of Attention (r = .43–.59, p < 0.01), and predicted performance on a cognitive control composite (r = .34–.41, p < 0.05). These results suggest that UCancellation is a valid test of selective attention and inhibitory control, which warrants further data collection to establish norms. Supplementary information: the online version contains supplementary material available at 10.3758/s13428-021-01765-5.
Understanding speech in the presence of acoustical competition is a major complaint of those with hearing difficulties. Here, a novel perceptual learning game was tested for its effectiveness in reducing difficulties with hearing speech in competition. The game was designed to train a mixture of auditory processing skills thought to underlie speech in competition, such as spectral-temporal processing, sound localization, and auditory working memory. Training on these skills occurred both in quiet and in competition with noise. Thirty college-aged participants without any known hearing difficulties were assigned either to this mixed-training condition or to an active control consisting of frequency discrimination training within the same gamified setting. To assess training effectiveness, tests of speech in competition (primary outcome), as well as basic suprathreshold auditory processing and cognitive processing abilities (secondary outcomes), were administered before and after training. Results suggest modest improvements on speech-in-competition tests in the mixed-training condition compared to the frequency-discrimination control condition (Cohen’s d = 0.68). Although the sample was small and consisted of individuals with normal hearing, these data suggest promise for future study in populations with hearing difficulties.