Sleep deprivation and/or a high workload can adversely affect driving performance, decreasing a driver's capacity to respond effectively in dangerous situations. In this context, drivers' physiological and brain activities have been increasingly investigated in the literature with the goal of providing useful feedback and alert signals in real time. In this study, we analyze the increase of cerebral workload and the onset of drowsiness during car driving in a simulated environment by using high-resolution electroencephalographic (EEG) techniques as well as neurophysiological variables such as heart rate (HR) and eye-blink rate (EBR). The simulated driving tasks were modulated across five levels of increasing difficulty. A workload index was then generated from the EEG signals and the related HR and EBR signals. Results suggest that the derived workload index is sensitive to the mental effort of the driver during the different driving tasks performed. This workload index was based on the estimation of the variation of the EEG power spectra in the theta band over prefrontal cortical areas and in the alpha band over parietal cortical areas. In addition, the results suggest that HR increases during the execution of the more difficult driving tasks, whereas it decreases at the onset of drowsiness. Finally, the results show that EBR increases when drowsiness occurs in the driver. The proposed workload index could therefore be used in the near future to assess the mental state of the driver on-line during a driving task.
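As an illustration only (not the authors' implementation), indices of this kind are often computed as a ratio of frontal theta power to parietal alpha power estimated from the EEG power spectra. The sketch below assumes that ratio form, illustrative channel groupings, a 256 Hz sampling rate, and synthetic data; all names and parameters are hypothetical.

```python
# Minimal sketch of a frontal-theta / parietal-alpha workload index.
# Assumptions: ratio form of the index, 256 Hz sampling rate, Welch PSD;
# none of these details are taken from the abstract above.
import numpy as np
from scipy.signal import welch

FS = 256  # sampling rate in Hz (assumed)

def band_power(signal, fs, fmin, fmax):
    """Integrated power spectral density of `signal` within [fmin, fmax] Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= fmin) & (freqs <= fmax)
    return np.trapz(psd[mask], freqs[mask])

def workload_index(frontal_eeg, parietal_eeg, fs=FS):
    """Ratio of frontal theta (4-8 Hz) power to parietal alpha (8-12 Hz) power.
    Higher values are taken here as a proxy for higher mental workload."""
    theta = np.mean([band_power(ch, fs, 4, 8) for ch in frontal_eeg])
    alpha = np.mean([band_power(ch, fs, 8, 12) for ch in parietal_eeg])
    return theta / alpha

# Example with synthetic data: 2 frontal and 2 parietal channels, 60 s each.
rng = np.random.default_rng(0)
frontal = rng.standard_normal((2, FS * 60))
parietal = rng.standard_normal((2, FS * 60))
print(workload_index(frontal, parietal))
```

In practice such an index would be computed over short sliding windows and compared against a per-driver baseline, so that increases can be flagged on-line during the driving task.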
SWift (SignWriting improved fast transcriber) is an advanced editor for SignWriting (SW). At present, SW is a promising alternative for providing documents in an easy-to-grasp written form of (any) Sign Language, the gestural mode of communication widely adopted by the deaf community. SWift was developed for SW users, whether deaf or not, to support collaboration and the exchange of ideas. The application allows composing and saving the desired signs using elementary components called glyphs. The editing procedure that was devised guides and simplifies the composition process. SWift aims at breaking the "electronic" barriers that keep the deaf community away from ICT in general, and from e-learning in particular. The editor can be packaged as a pluggable module; therefore, it can be integrated wherever the use of SW is an advisable alternative to written "verbal" language, which often hinders information grasping by deaf users.
Deaf people are more heavily affected by the digital divide than many would expect. Moreover, most accessibility guidelines addressing their needs deal only with captioning and audio-content transcription. This approach, however, does not consider that deaf people have considerable difficulties with vocal languages, even in their written form. At present, only a few organizations, such as W3C, have produced guidelines dealing with one of their most distinctive means of expression: Sign Language (SL). SL is, in fact, the visual-gestural language used by many deaf people to communicate with each other. The present work aims at supporting the e-learning user experience (e-LUX) for these specific users by enhancing the accessibility of content and container services. In particular, we propose preliminary solutions to tailor activities that can be more fruitful when performed in one's own "native" language, which for most deaf people, especially younger ones, is the national SL.
An important field for model-driven development of interfaces is the consideration of users with disabilities. Interface design for deaf people presents specific problems, since it needs to be based on visual communication and to incorporate unusual forms of interaction, in particular gesture-based ones. Standard solutions for model-driven development of visual interfaces lack specific constructs for structuring these more sophisticated forms of interaction. This paper discusses such issues in the context of the development of a deaf-centered e-learning environment. Sign Languages enter this context as a suitable alternative communication code, both in video form and through one of their most successful written forms, namely SignWriting.