Cockroaches can traverse unknown obstacle terrain, self-right on the ground, and climb over obstacles. However, their motion is limited under certain conditions, such as reduced activity in bright light and at lower temperatures. The movement of cyborg cockroaches therefore needs to be optimized before the insects can be used effectively as cyborg platforms. This study aims to increase the search rate and distance traveled by cockroaches, and to reduce their stop time, through automatic stimulation triggered by machine learning. Multiple machine learning classifiers were applied to the offline binary classification of cockroach movement based on inertial measurement unit (IMU) signals. Ten time-domain features were selected and used as the classifier inputs. The best-performing classifier was then implemented for online motion recognition, with automatic stimulation delivered to the cerci to trigger the free-walking motion of the cockroach. A user interface was developed to run multiple computational processes concurrently in real time, such as computer vision, data acquisition, feature extraction, automatic stimulation, and machine learning, using a multithreading algorithm. The experimental results demonstrate that the movement performance of cockroaches was significantly improved by applying machine learning classification and automatic stimulation: the system increased the search rate and traveled distance by 68% and 70%, respectively, while the stop time was reduced by 78%.
Insect-based biobots have been investigated for various applications, such as search-and-rescue operations, environmental monitoring, and exploration. These applications require strong international collaboration to complete their tasks. However, during the COVID-19 pandemic, most people could not easily travel from one country to another because of travel bans. In addition, controlling biobots is challenging because only experts can manage cockroach behavior with and without stimulated responses. To address this issue, we propose a user-friendly teleoperation user interface (UI) to monitor and control a biobot between Japan and Bangladesh without on-site operation by experts. This study used Madagascar hissing cockroaches (MHC) as biobot hybrid robots. A multithreading algorithm was implemented to run multiple parallel computations concurrently in the UI. Virtual network computing (VNC) served as the remote communication layer of the teleoperation UI, streaming real-time video from Japan to Bangladesh and sending remote commands from Bangladesh to Japan. In the experiments, a remote operator successfully steered the biobot along a predetermined path through the developed teleoperation UI with a time delay of 275 ms. The proposed interactive and intuitive UI enables a promising and reliable system for teleoperated biobots between two remote countries.
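The multithreaded UI structure described above, one thread streaming video while another forwards remote steering commands, can be sketched with Python's standard `threading` and `queue` modules. This is a hedged sketch only: the thread bodies are placeholders (the real system uses VNC for transport and a camera/stimulator backend), and the function and queue names are illustrative.

```python
import queue
import threading
import time

def video_capture(frames: queue.Queue, stop: threading.Event):
    """Placeholder for the video-streaming thread (camera access assumed)."""
    n = 0
    while not stop.is_set():
        frames.put(f"frame-{n}")  # a real system would push camera images over VNC
        n += 1
        time.sleep(0.01)

def command_sender(commands: queue.Queue, log: list, stop: threading.Event):
    """Placeholder for the thread forwarding remote steering commands."""
    while not stop.is_set() or not commands.empty():
        try:
            cmd = commands.get(timeout=0.05)
        except queue.Empty:
            continue
        log.append(cmd)  # a real system would transmit this to the stimulator

frames: queue.Queue = queue.Queue()
commands: queue.Queue = queue.Queue()
sent = []
stop = threading.Event()

threads = [
    threading.Thread(target=video_capture, args=(frames, stop)),
    threading.Thread(target=command_sender, args=(commands, sent, stop)),
]
for t in threads:
    t.start()

# The remote operator's steering inputs arrive as commands.
for cmd in ("left", "right", "forward"):
    commands.put(cmd)
time.sleep(0.2)
stop.set()
for t in threads:
    t.join()
```

Queues decouple the producer and consumer threads, so a slow network link delays command delivery without stalling the video stream.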
Sensor-based facial expression recognition (FER) is an attractive research topic. FER is now used in applications such as smart environments and healthcare solutions; through FER technology, machines can learn human emotion, which is fundamental to the quantitative analysis of human sentiment. Camera-based FER is an image recognition problem within the broader field of computer vision, and face detection, tracking, and reliable face recognition still present considerable challenges for researchers in computer vision and pattern recognition. First, data processing and analytics are intensive and require substantial computational resources and memory. Second, a fundamental technical limitation is robustness to changes in the environment. Finally, illumination variation further complicates the design of robust algorithms because of changing shadow casts. Sensor-based FER overcomes these limitations. Sensor technologies, especially in low-power operation, wireless communication, high capacity, and data processing, have made substantial progress, making it possible for sensors to evolve from low-level data collection and transmission to high-level inference. This study aims to develop a stretchable sensor-based FER system. We use the random forest machine learning algorithm to train the FER model, with a commercial stretchable-sensor facial expression dataset processed in the Anaconda environment. In this research, our stretch-sensor FER dataset achieved around 95% accuracy for four emotions (Neutral, Happy, Sad, and Disgust).
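The random forest training step for the four emotion classes can be sketched as below. This is an illustrative sketch, not the study's code: the dataset is synthetic (each emotion is assumed to produce a distinct strain pattern across four hypothetical stretch-sensor channels), and the channel count and model hyperparameters are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
emotions = ["Neutral", "Happy", "Sad", "Disgust"]

# Synthetic stand-in for the stretch-sensor dataset: each emotion mainly
# strains a different one of four hypothetical sensor channels.
centers = np.eye(4)
X = np.vstack([c + rng.normal(0.0, 0.1, size=(100, 4)) for c in centers])
y = np.repeat(np.arange(4), 100)  # class index into `emotions`

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)  # held-out classification accuracy
```

With real stretch-sensor recordings in place of the synthetic data, the same train/test workflow yields the per-emotion accuracy reported in the study.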