Most research investigating auditory perception is conducted in controlled laboratory settings, potentially restricting its generalizability to the complex acoustic environment outside the lab. The present study, in contrast, investigated auditory attention with long-term recordings (> 6 h) beyond the lab using a fully mobile, smartphone-based ear-centered electroencephalography (EEG) setup with minimal restrictions for participants. Twelve participants completed iterations of two variants of an oddball task where they had to react to target tones and to ignore standard tones. A rapid variant of the task (tones every 2 s, 5 min total time) was performed seated and with full focus in the morning, around noon and in the afternoon under controlled conditions. A sporadic variant (tones every minute, 160 min total time) was performed once in the morning and once in the afternoon while participants followed their normal office day routine. EEG data, behavioral data, and movement data (with a gyroscope) were recorded and analyzed. The expected increased amplitude of the P3 component in response to the target tone was observed for both the rapid and the sporadic oddball. Miss rates were lower and reaction times were faster in the rapid oddball compared to the sporadic one. The movement data indicated that participants spent most of their office day at relative rest. Overall, this study demonstrated that it is feasible to study auditory perception in everyday life with long-term ear-EEG.
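The P3 analysis described above rests on standard ERP averaging: epochs are cut around each tone onset, baseline-corrected with the pre-stimulus interval, and averaged. The sketch below illustrates this pipeline in plain NumPy on simulated single-channel data; the sampling rate, epoch window, and the injected "P3-like" latency are illustrative assumptions, not parameters from the study.

```python
import numpy as np

def compute_erp(eeg, events, sfreq, tmin=-0.2, tmax=0.8):
    """Average EEG epochs around event samples to obtain an ERP.

    eeg    : 1-D array, continuous EEG from one channel (microvolts)
    events : sample indices of stimulus onsets
    sfreq  : sampling rate in Hz
    """
    start = int(tmin * sfreq)  # negative: samples before the event
    stop = int(tmax * sfreq)
    epochs = []
    for ev in events:
        if ev + start < 0 or ev + stop > len(eeg):
            continue  # skip epochs that run off the recording
        epoch = eeg[ev + start:ev + stop]
        # baseline-correct with the pre-stimulus interval (first -start samples)
        epoch = epoch - epoch[:-start].mean()
        epochs.append(epoch)
    return np.mean(epochs, axis=0)

# Toy demonstration: noise plus a simulated evoked deflection 300 ms after each tone
sfreq = 250
eeg = np.random.default_rng(0).normal(0, 1, 60 * sfreq)
events = np.arange(2 * sfreq, len(eeg) - sfreq, 2 * sfreq)  # one "tone" every 2 s
for ev in events:
    eeg[ev + int(0.3 * sfreq)] += 5.0  # injected "P3-like" response
erp = compute_erp(eeg, events, sfreq)
```

Averaging across epochs cancels activity that is not time-locked to the tone, which is why the injected deflection survives while the background noise shrinks.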
With smartphone-based mobile electroencephalography (EEG), we can investigate sound perception beyond the lab. To understand sound perception in the real world, we need to relate naturally occurring sounds to EEG data. For this, EEG and audio information need to be synchronized precisely; only then is it possible to capture fast and transient evoked neural responses and relate them to individual sounds. We have developed Android applications (AFEx and Record-a) that allow for the concurrent acquisition of EEG data and audio features, i.e., sound onsets, average signal power (RMS), and power spectral density (PSD), on a smartphone. In this paper, we evaluate these apps by computing event-related potentials (ERPs) evoked by everyday sounds. One participant listened to piano notes (played live by a pianist) and to a home-office soundscape. Timing tests showed a stable lag and a small jitter (< 3 ms), indicating a high temporal precision of the system. We calculated ERPs to sound onsets and observed the typical P1-N1-P2 complex of auditory processing. Furthermore, we show how to relate information on loudness (RMS) and spectra (PSD) to brain activity. In future studies, we can use this system to study sound processing in everyday life.
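The audio features named above (RMS and PSD) can be computed frame-by-frame from the raw signal. The NumPy sketch below is an illustrative implementation, not the actual AFEx feature extraction; the frame length and Hann windowing are assumptions.

```python
import numpy as np

def frame_features(audio, sfreq, frame_len=0.025):
    """Compute per-frame RMS and a one-sided PSD estimate from an audio signal.

    Illustrative sketch only; the feature definitions used by AFEx may differ.
    """
    n = int(frame_len * sfreq)
    n_frames = len(audio) // n
    frames = audio[:n_frames * n].reshape(n_frames, n)
    rms = np.sqrt(np.mean(frames ** 2, axis=1))
    # One-sided periodogram per frame as a simple PSD estimate
    window = np.hanning(n)
    spec = np.fft.rfft(frames * window, axis=1)
    psd = (np.abs(spec) ** 2) / (sfreq * np.sum(window ** 2))
    psd[:, 1:-1] *= 2  # fold negative frequencies into the one-sided estimate
    freqs = np.fft.rfftfreq(n, 1 / sfreq)
    return rms, freqs, psd

# A pure 440 Hz tone should concentrate its power near 440 Hz
sfreq = 16000
t = np.arange(sfreq) / sfreq
tone = 0.5 * np.sin(2 * np.pi * 440 * t)
rms, freqs, psd = frame_features(tone, sfreq)
```

For a sine of amplitude 0.5, the per-frame RMS is 0.5/√2 ≈ 0.354, and the PSD of each frame peaks at the 440 Hz bin; relating such features to the EEG then reduces to matching frame timestamps against the EEG timeline.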
The streaming and recording of smartphone sensor signals is desirable for mHealth, telemedicine, environmental monitoring and other applications. Time series data gathered in these fields typically benefit from the time-synchronized integration of different sensor signals. However, solutions required for this synchronization are mostly available for stationary setups. We hope to contribute to the important emerging field of portable data acquisition by presenting open-source Android applications both for the synchronized streaming (Send-a) and recording (Record-a) of multiple sensor data streams. We validate the applications in terms of functionality, flexibility and precision in fully mobile setups and in hybrid setups combining mobile and desktop hardware. Our results show that the fully mobile solution is equivalent to well-established desktop versions. With the streaming application Send-a and the recording application Record-a, purely smartphone-based setups for mobile research and personal health settings can be realized on off-the-shelf Android devices.
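Once two time-stamped streams have been recorded, post-hoc alignment can be as simple as interpolating one stream onto the other's timestamps, provided both carry timestamps on a shared clock (as a lab-streaming-layer-style setup provides). The sketch below is a hedged illustration with hypothetical stream rates, not the synchronization mechanism used internally by Send-a or Record-a.

```python
import numpy as np

def align_streams(t_a, x_a, t_b, x_b):
    """Resample stream B onto stream A's timestamps by linear interpolation.

    t_a, x_a : timestamps and samples of stream A
    t_b, x_b : timestamps and samples of stream B (different rate)
    Returns an array with columns [time, x_a, x_b_resampled].
    """
    x_b_on_a = np.interp(t_a, t_b, x_b)  # clamps outside t_b's range
    return np.column_stack([t_a, x_a, x_b_on_a])

# Toy example: a 100 Hz stream and a 60 Hz stream sampling the same ramp signal
t_a = np.arange(0, 1, 1 / 100)
t_b = np.arange(0, 1, 1 / 60)
aligned = align_streams(t_a, 2 * t_a, t_b, 2 * t_b)
```

After alignment, the two columns agree wherever the slower stream's range covers the timestamp, which is the precondition for any joint analysis of multi-rate sensor data.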
The cEEGrid (ear-electroencephalography; ear-EEG) is an unobtrusive and comfortable electrode array affixed around the ear. It is suited to investigate brain activity outside of the laboratory for long durations. Previous research established that cEEGrids can be used to study various cognitive processes in and also beyond the lab, even for a whole day. To record high-quality ear-EEG data, careful preparation is necessary. In this protocol, we explain the steps needed for successful experimenting with cEEGrids: First, we show how to test the functionality of the cEEGrid prior to a recording. Second, we describe how to prepare the participant and to fit the cEEGrid, which is the most important step to record high-quality data. Third, we outline how to connect the cEEGrids to the amplifier and how to check the signal quality. In this protocol, we give best practice recommendations and tips that make cEEGrid recordings easier. If researchers follow this protocol, they are comprehensively equipped for experimenting with the cEEGrid in and beyond the lab.
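Beyond visual inspection, the signal-quality check mentioned in the third step can be supported by a simple automated criterion, e.g., flagging channels whose variance deviates strongly from the rest. The robust z-score rule below is a hypothetical aid sketched for illustration, not part of the official cEEGrid protocol.

```python
import numpy as np

def flag_noisy_channels(eeg, threshold=3.0):
    """Flag channels whose standard deviation is an outlier among channels.

    Hypothetical criterion: a robust z-score based on the median and the
    median absolute deviation (MAD) of the per-channel standard deviations.
    eeg : (n_channels, n_samples) array
    """
    sd = eeg.std(axis=1)
    med = np.median(sd)
    mad = np.median(np.abs(sd - med)) + 1e-12  # avoid division by zero
    z = (sd - med) / (1.4826 * mad)  # 1.4826 scales MAD to a normal sd
    return np.where(np.abs(z) > threshold)[0]

# Toy demonstration: ten channels of unit noise, one made twenty times noisier
rng = np.random.default_rng(1)
eeg = rng.normal(0, 1, (10, 5000))
eeg[3] *= 20  # one channel with excessive noise, e.g., a poorly seated electrode
bad = flag_noisy_channels(eeg)
```

A median/MAD rule is preferable to a mean/sd rule here because a single very noisy channel would otherwise inflate the reference statistics and mask itself.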