Nowadays UX design has reached a new level. New modes of interaction, such as finger and hand movement, have been introduced, and technology now offers a thought-driven approach through the so-called brain-computer interface (BCI). This possibility opens new challenges for users as well as for designers and researchers.

Devices for intercepting brain signals, such as the Emotiv Epoc and NeuroSky headsets, have existed for more than 15 years. Yet the reliable translation of user commands to an application remains an unsolved challenge for modern scientists and software developers, with no major leaps in advancement over the technology's lifetime. Success requires the effective interaction of several adaptive components: the user's brain, which produces activity that encodes intent; the BCI system, which translates that activity into digital signals; and the computer algorithms that convert those signals into commands, whose accuracy determines the quality of the whole pipeline. To address this complex and monumental task, many teams are exploring a variety of signal analysis techniques to improve the adaptation of the BCI system to the user. However, publications rarely describe the methods, steps, and algorithms used to discern different commands, words, and signals.

This article describes one approach to the retrieval, analysis, and processing of the received signals. These data are the result of researching the capabilities of Arduino robot management through the brain signals received by a BCI.
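The pipeline outlined above (brain activity → digital signal → command) can be sketched in miniature. The following Python snippet is a hypothetical illustration, not the article's actual method: the band boundaries, the beta-power feature, the threshold value, and the command names are all assumptions chosen for demonstration. Real BCI systems use far more sophisticated filtering and classification.

```python
import math

def band_power(samples, sample_rate, low_hz, high_hz):
    """Estimate signal power in [low_hz, high_hz] with a naive DFT.

    Illustrative only: a real pipeline would use an FFT plus proper
    windowing and artifact rejection.
    """
    n = len(samples)
    power = 0.0
    for k in range(n // 2):
        freq = k * sample_rate / n
        if low_hz <= freq <= high_hz:
            re = sum(s * math.cos(2 * math.pi * k * i / n)
                     for i, s in enumerate(samples))
            im = -sum(s * math.sin(2 * math.pi * k * i / n)
                      for i, s in enumerate(samples))
            power += (re * re + im * im) / n
    return power

def classify(samples, sample_rate, threshold):
    """Map a window of raw samples to a robot command.

    Hypothetical mapping: strong beta-band (13-30 Hz) activity is
    treated as intent to move, anything else as a stop command.
    """
    beta = band_power(samples, sample_rate, 13.0, 30.0)
    return "FORWARD" if beta > threshold else "STOP"
```

For example, a one-second window dominated by a 20 Hz oscillation (inside the assumed beta band) would classify as `FORWARD`, while a flat signal would classify as `STOP`. The design point this sketch makes is that the classifier sees only extracted features (band power), never the raw waveform, which is what makes per-user adaptation (tuning `threshold`) tractable.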