Pupil center recognition and localization is an essential branch of ergonomics, with applications in emotion analysis and attention assessment. Obtaining the position of the pupil center from eye photographs is the core problem of this field. Previous studies provided a useful method that extracts scale-invariant feature transform (SIFT) features and combines them with a K-Nearest Neighbor (KNN) classifier. However, its accuracy is unsatisfactory, and under some conditions it suffers from position drift and other problems. We put forward a new approach that uses Oriented FAST and Rotated BRIEF (ORB) features with a Random Forest (RF) classifier. Experiments show that our method improves the robustness of localization, and the use of isophotes yields low computational cost, allowing for real-time processing. Meanwhile, we found that the combination of ORB and RF performs comparably, yielding an accuracy of 92.88% on the BioID database.
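As a rough illustration of the pipeline this abstract describes, the sketch below pairs OpenCV's ORB descriptors with a scikit-learn Random Forest to score candidate eye patches and pick the highest-scoring one as the pupil center. The mean-pooling of per-keypoint descriptors, the patch size, the sliding-window search, and the forest hyperparameters are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch: ORB descriptors + Random Forest for pupil-center patch scoring.
# Pooling strategy, patch size, and hyperparameters below are assumptions.
import cv2
import numpy as np
from sklearn.ensemble import RandomForestClassifier

orb = cv2.ORB_create(nfeatures=50)

def orb_feature(patch: np.ndarray) -> np.ndarray:
    """Extract a fixed-length ORB feature from a grayscale eye patch."""
    _, descriptors = orb.detectAndCompute(patch, None)
    if descriptors is None:                                   # no keypoints found
        return np.zeros(32, dtype=np.float32)
    return descriptors.mean(axis=0).astype(np.float32)        # mean-pool 32-byte descriptors

def train(patches: list[np.ndarray], labels: list[int]) -> RandomForestClassifier:
    """Train an RF to label patches as pupil-center (1) or background (0)."""
    X = np.stack([orb_feature(p) for p in patches])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X, labels)
    return clf

def locate_pupil(eye: np.ndarray, clf: RandomForestClassifier,
                 size: int = 24, stride: int = 4) -> tuple[int, int]:
    """Slide a window over the eye image and return the center (x, y) of the
    patch with the highest pupil-center probability."""
    best, best_xy = -1.0, (0, 0)
    h, w = eye.shape
    for y in range(0, h - size, stride):
        for x in range(0, w - size, stride):
            feat = orb_feature(eye[y:y + size, x:x + size]).reshape(1, -1)
            p = clf.predict_proba(feat)[0, 1]
            if p > best:
                best, best_xy = p, (x + size // 2, y + size // 2)
    return best_xy
```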
Situational awareness is a pilot's ability to grasp the flight status, and it is of great significance to aviation safety and flight performance. According to the information-processing model, the pilot's main information-processing steps are sensation, perception, and execution. Situation awareness analysis guided by visual gaze suffers from problems such as large analysis deviation and high delay, owing to the many influencing factors and complex characteristics involved. To address this, this paper proposes a situation awareness assessment method based on an artificial neural network that integrates visual gaze and flight control. First, simulated flight training experiments are carried out with flight cadets, collecting their eye movement, gaze tracking, flight control, and flight parameter data. Next, an event-based situation awareness analysis method is established for the flight subjects, and the situation awareness state in the experiment is evaluated through the flight parameter data. The visual gaze and flight control data are then sliced in units of situational awareness events to construct the data set. Finally, a multi-channel sequence classification model based on the Transformer is designed, in which the situation awareness characteristics of visual gaze and control behavior are analyzed through the attention mechanism. Experimental results show that the designed neural network model classifies situation awareness on the experimental data set with an accuracy of 96% and can classify and evaluate the pilot's situation awareness state within 5 s.
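A minimal sketch of such a Transformer-based multi-channel sequence classifier is given below, assuming PyTorch. The channel layout (gaze coordinates, pupil size, stick and throttle inputs), the window length of an event slice, and all hyperparameters are illustrative assumptions rather than values reported in the paper.

```python
# Hedged sketch: Transformer encoder over multi-channel event slices for
# situation awareness classification. All dimensions are assumed values.
import torch
import torch.nn as nn

class SAClassifier(nn.Module):
    def __init__(self, n_channels: int = 8, d_model: int = 64,
                 n_heads: int = 4, n_layers: int = 2, n_classes: int = 2):
        super().__init__()
        self.proj = nn.Linear(n_channels, d_model)              # per-timestep channel embedding
        self.pos = nn.Parameter(torch.zeros(1, 512, d_model))   # learned positional encoding
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)   # self-attention over time
        self.head = nn.Linear(d_model, n_classes)               # SA-state logits

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, channels) -- one slice per situational-awareness event
        h = self.proj(x) + self.pos[:, :x.size(1)]
        h = self.encoder(h)
        return self.head(h.mean(dim=1))                         # mean-pool over time

# Example: classify a batch of 4 event slices, each 200 timesteps x 8 channels.
model = SAClassifier()
logits = model(torch.randn(4, 200, 8))   # -> shape (4, 2)
```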