Noise is a proven cause of wakefulness and qualitative sleep disturbance in critically ill patients. Reducing sound pressure levels can improve sleep quality, but no studies have shown the feasibility of such a noise reduction in the intensive care unit (ICU) setting. Considering all available evidence, we redesigned two ICU rooms with the aim of investigating the physiological and clinical impact of a healing environment, including noise reduction and day-night variation of sound levels. Within an experimental design, we recorded 96 h of sound pressure levels in standard and modified ICU rooms. In addition, human observers logged the sources of the recorded sounds. Our results show that the architectural interventions reduced A-weighted equivalent sound pressure levels and maximum sound pressure levels. During night-time, the modification led to a significant decrease in 50 dB threshold overruns from 65.5% to 39.9% (door side) and from 50.0% to 10.5% (window side). Sound peaks of more than 60 dB were significantly reduced from 62.0% to 26.7% (door side) and from 59.3% to 30.3% (window side). Time-series analysis of linear trends revealed a significantly more distinct day-night pattern in the modified rooms, with lower sound levels during night-time. Sound sources observed during the night revealed four times as many talking events in the standard room as in the modified room. In summary, we show that it is feasible to reduce sound pressure levels through architectural modifications.
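A threshold-overrun percentage of the kind reported above is simply the share of measured samples whose level exceeds a given threshold. The following is a minimal sketch of that computation; the synthetic data, the 1-second sampling interval, and the function name are illustrative assumptions, not the study's actual pipeline or measurements.

```python
# Sketch: threshold-overrun percentages from a sound-level time series.
# The synthetic night-time data and 1-s sampling are assumptions for
# illustration only; they are not the study's measured values.
import numpy as np

def overrun_percentage(levels_db: np.ndarray, threshold_db: float) -> float:
    """Percentage of samples whose sound pressure level exceeds the threshold."""
    return 100.0 * np.mean(levels_db > threshold_db)

# Hypothetical night-time recording: one A-weighted level (dB) per second, 8 h.
rng = np.random.default_rng(0)
night_levels = rng.normal(loc=48.0, scale=5.0, size=8 * 3600)

print(f"> 50 dB overruns: {overrun_percentage(night_levels, 50.0):.1f}%")
print(f"> 60 dB peaks:    {overrun_percentage(night_levels, 60.0):.1f}%")
```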
Brain-computer interfaces (BCIs) provide a means of controlling a device by brain activity alone. One major drawback of noninvasive BCIs is their low information transfer rate (ITR), which obstructs wider deployment outside the lab. BCIs based on code-modulated visual evoked potentials (cVEPs) outperform all other state-of-the-art systems in this regard. Previous work investigated cVEPs for spelling applications. We present the first cVEP-based BCI for use in real-world settings to accomplish everyday tasks such as navigation or action selection. To this end, we developed and evaluated a cVEP-based online BCI that controls a virtual agent in a simulated, but realistic, 3-D kitchen scenario. We show that cVEPs can be reliably triggered with stimuli in less restricted presentation schemes, such as on dynamic, changing backgrounds. We introduce a novel dynamic repetition algorithm that allows the balance between accuracy and speed to be optimized individually for each user. Using these novel mechanisms in a 12-command cVEP BCI in the 3-D simulation yields ITRs of 50 bits/min on average and 68 bits/min at maximum. This work thus supports the notion of cVEP BCIs as a particularly fast and robust approach suitable for real-world use.
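ITR figures like those above are conventionally computed with the Wolpaw formula, which converts the number of commands, the classification accuracy, and the selection time into bits per minute. The sketch below implements that standard formula; the 12-command count matches the abstract, but the accuracy and selection time are illustrative assumptions, not the study's measured values.

```python
# Sketch: the standard Wolpaw ITR formula commonly used to report BCI
# performance. The example accuracy and selection time are assumptions.
import math

def itr_bits_per_min(n_commands: int, accuracy: float,
                     seconds_per_selection: float) -> float:
    """Wolpaw ITR: bits per selection, scaled to selections per minute."""
    if accuracy <= 1.0 / n_commands:
        return 0.0  # at or below chance level, the formula is not meaningful
    bits = math.log2(n_commands) + accuracy * math.log2(accuracy)
    if accuracy < 1.0:
        bits += (1 - accuracy) * math.log2((1 - accuracy) / (n_commands - 1))
    return bits * (60.0 / seconds_per_selection)

# e.g. 12 commands, 95% accuracy, 3.5 s per selection -> ~54 bits/min
print(itr_bits_per_min(12, 0.95, 3.5))
```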
The co-registration of eye tracking and electroencephalography (EEG) provides a holistic measure of ongoing cognitive processes. Recently, fixation-related potentials have been introduced to quantify the neural activity in such bimodal recordings. Fixation-related potentials are time-locked to fixation onsets, just as event-related potentials are time-locked to stimulus onsets. Compared to existing EEG-based brain-machine interfaces that depend on visual stimuli, fixation-related potentials have the advantage that they can be used in free, unconstrained viewing conditions and can be classified at the single-trial level. Fixation-related potentials thus have the potential to enable conceptually different brain-machine interfaces that directly interpret cortical activity related to the visual processing of specific objects. However, existing research has investigated fixation-related potentials only with very restricted and highly unnatural stimuli in simple search tasks, with participants' body movements constrained. We present a study in which we relieved many of these restrictions while retaining some experimental control by using a gaze-contingent visual search task. Participants had to find a target object among 12 complex, everyday objects presented on a screen while their brain activity and eye movements were recorded simultaneously. Our results show that our proposed method for the classification of fixation-related potentials can clearly discriminate between fixations on relevant, non-relevant, and background areas. Furthermore, we show that our classification approach generalizes not only to different test sets from the same participant but also across participants. These results promise to open novel avenues for exploiting fixation-related potentials in EEG-based brain-machine interfaces and thus to provide a new means of intuitive human-machine interaction.
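The core technical idea, epoching EEG time-locked to fixation onsets rather than stimulus onsets, can be sketched with standard tools such as MNE-Python. In the sketch below, the continuous recording, the synchronized fixation-onset sample indices, the epoch window, and the classifier choice are all illustrative assumptions rather than the study's exact pipeline.

```python
# Sketch: fixation-related potentials (FRPs) via fixation-locked epoching,
# assuming `raw` (continuous EEG) and `fixation_samples` (eye-tracker
# fixation onsets, already synchronized to EEG samples) are available.
import numpy as np
import mne
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def frp_epochs(raw: mne.io.Raw, fixation_samples: np.ndarray,
               labels: np.ndarray) -> mne.Epochs:
    """Epochs locked to fixation onsets, analogous to stimulus-locked ERPs.
    labels: hypothetical event ids, e.g. 1=target, 2=non-relevant, 3=background."""
    events = np.column_stack([fixation_samples,
                              np.zeros_like(fixation_samples),
                              labels])  # MNE event format: (sample, 0, id)
    return mne.Epochs(raw, events, tmin=-0.2, tmax=0.8,
                      baseline=(None, 0), preload=True)

def fit_classifier(epochs: mne.Epochs, labels: np.ndarray):
    """Single-trial classification on flattened epoch data (illustrative)."""
    X = epochs.get_data().reshape(len(epochs), -1)
    return LinearDiscriminantAnalysis().fit(X, labels)
```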
We propose an online hybrid BCI system that combines the P300 response and event-related desynchronization (ERD). By employing both brain activity patterns (BAPs) in parallel and asynchronously, the system can issue different types of commands, for example, in robotic control scenarios. We present a method for reliably distinguishing between the two BAPs. We examined the rate of false positives in P300 classification while a subject tried to evoke an ERD and found it to be as low as in regular P300 trials. Our system therefore assumes the presence of ERD whenever classification of all P300 symbols is negative. Empirical results indicate that subjects can achieve good control over the hybrid BCI; in particular, they can switch spontaneously and reliably between the two BAPs.
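The decision rule described above, falling back to ERD whenever every P300 symbol is rejected, can be expressed compactly. The following is a minimal sketch under that assumption; the score representation, threshold, and function names are hypothetical placeholders, not the paper's implementation.

```python
# Sketch of the hybrid decision rule: select a P300 symbol if any
# classifier score passes the threshold, otherwise assume ERD.
# Scores, threshold, and names are hypothetical placeholders.
from typing import Optional, Sequence

def hybrid_decision(p300_scores: Sequence[float], p300_threshold: float,
                    symbols: Sequence[str], erd_command: str) -> str:
    best = max(range(len(p300_scores)), key=lambda i: p300_scores[i])
    if p300_scores[best] >= p300_threshold:
        return symbols[best]   # P300 pathway: the attended symbol
    return erd_command         # all symbols negative -> assume ERD
```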