Hand gesture recognition technology plays an important role in human-computer interaction and in-vehicle entertainment. Designing a gesture recognition system for the in-vehicle environment is challenging due to variable driving conditions, complex backgrounds, and diverse gestures. In this paper, we propose a gesture recognition system for the in-vehicle environment based on frequency-modulated continuous-wave (FMCW) radar and a transformer. First, the raw range-Doppler maps (RDMs), range-azimuth maps (RAMs), and range-elevation maps (REMs) of each gesture's time sequence are obtained by radar signal processing. The resulting data frames are then preprocessed with region-of-interest (ROI) extraction, vibration removal, background removal, and standardization. We propose RGTNet, a transformer-based radar gesture recognition network that extracts and fuses the spatio-temporal information of the radar feature maps to classify the gestures. The experimental results show that our method effectively handles the eight-gesture classification task in the in-vehicle environment, achieving a recognition accuracy of 97.56%.
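The abstract does not spell out the preprocessing details, but the pipeline it names (ROI extraction, background removal, standardization) can be illustrated with a minimal sketch. The array shape, ROI bounds, and mean-subtraction background model below are assumptions for illustration, not the paper's exact algorithms:

```python
import numpy as np

def preprocess_rdm_sequence(rdms, roi_range=(0, 32), roi_doppler=(16, 48)):
    """Hypothetical preprocessing of one gesture's RDM sequence.

    rdms: float array of shape (n_frames, n_range, n_doppler).
    ROI bounds and the background model are illustrative assumptions.
    """
    # Region-of-interest extraction: keep only the range/Doppler bins
    # where the hand is expected to appear.
    roi = rdms[:, roi_range[0]:roi_range[1], roi_doppler[0]:roi_doppler[1]]

    # Simple background removal: subtract the per-bin temporal mean,
    # which suppresses static clutter such as the cabin interior.
    roi = roi - roi.mean(axis=0, keepdims=True)

    # Standardization: zero mean, unit variance over the whole sequence.
    return (roi - roi.mean()) / (roi.std() + 1e-8)

# Usage: a synthetic 30-frame sequence of 64x64 RDMs.
frames = preprocess_rdm_sequence(np.random.rand(30, 64, 64))
print(frames.shape)  # (30, 32, 32)
```

The same steps would apply to the RAM and REM sequences before they are fed to RGTNet.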
Automotive millimeter-wave (MMW) radar is essential in autonomous vehicles because of its robustness in all weather conditions. Traditional commercial automotive radars are limited by low resolution, which makes object classification difficult; this motivated the new generation of four-dimensional (4D) imaging radar, which offers high azimuth and elevation resolution and retains Doppler information, producing high-quality point clouds. In this paper, we propose an object classification network named Radar Transformer. The network is built around the attention mechanism and combines vector attention with scalar attention to make full use of the spatial, Doppler, and reflection-intensity information of the radar point cloud, realizing a deep fusion of local and global attention features. We generated and manually annotated an imaging radar classification dataset. The experimental results show that our proposed method achieves an overall classification accuracy of 94.9%, handling radar point clouds better than popular deep learning baselines and showing promising performance.
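The abstract contrasts scalar and vector attention without defining them. As a rough sketch (not the paper's actual Radar Transformer layers), scalar attention assigns one weight per query-key pair, while vector attention, as popularized by Point Transformer, produces a per-channel weight vector for each pair; the MLP below is a hypothetical relation encoder:

```python
import torch
import torch.nn.functional as F

def scalar_attention(q, k, v):
    """Dot-product attention: one scalar weight per (query, key) pair."""
    attn = F.softmax(q @ k.T / q.shape[-1] ** 0.5, dim=-1)  # (n, n)
    return attn @ v                                          # (n, d)

def vector_attention(q, k, v, mlp):
    """Vector attention: a d-dimensional weight per pair, so every
    feature channel is reweighted independently."""
    rel = q.unsqueeze(1) - k.unsqueeze(0)   # (n, n, d) pairwise relations
    w = F.softmax(mlp(rel), dim=1)          # (n, n, d) per-channel weights
    return (w * v.unsqueeze(0)).sum(dim=1)  # (n, d)

# Usage on a toy set of 8 points with 16 embedded features
# (e.g., position, Doppler, and intensity after an input MLP).
n, d = 8, 16
q = k = v = torch.randn(n, d)
mlp = torch.nn.Sequential(torch.nn.Linear(d, d), torch.nn.ReLU(),
                          torch.nn.Linear(d, d))
print(scalar_attention(q, k, v).shape)      # torch.Size([8, 16])
print(vector_attention(q, k, v, mlp).shape)
```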
Traffic participant classification is critical in autonomous driving perception. Millimetre-wave radio detection and ranging (RADAR) is a cost-effective and robust means of performing this task in adverse traffic scenarios such as inclement weather (e.g. fog, snow, and rain) and poor lighting conditions. Compared to commercial two-dimensional RADAR, the new generation of three-dimensional (3D) RADAR can obtain targets' height information as well as dense point clouds, greatly improving target classification capabilities. This study proposes a multi-objective classification method for traffic participants based on 3D RADAR point clouds. First, a 22-dimensional feature vector is extracted from the 3D RADAR point cloud distribution to describe the shape, dispersion, Doppler, and reflection-intensity characteristics of the targets. Then, dynamic and static datasets containing five classes of targets, roughly 10k frames in total, were produced. Extensive experiments were conducted to build machine learning classifiers. The experimental results show that the trained classifiers achieve over 92% classification accuracy when the targets are divided into five groups and over 95% when they are divided into four groups. The proposed method can guide the design of safer and more efficient intelligent driving systems.
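The abstract does not list the 22 individual features, but hand-crafted descriptors of this kind typically combine bounding-box, spread, Doppler, and intensity statistics. The sketch below computes a representative (and smaller) subset and feeds it to a scikit-learn classifier; all names are illustrative:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def cluster_features(points):
    """Illustrative features for one RADAR cluster.

    points: array of shape (n, 5) with columns x, y, z, doppler, rcs.
    This is a 13-dimensional subset, not the paper's 22-dimensional vector.
    """
    xyz, doppler, rcs = points[:, :3], points[:, 3], points[:, 4]
    extent = xyz.max(axis=0) - xyz.min(axis=0)  # shape: box length/width/height
    spread = xyz.std(axis=0)                    # dispersion of the points
    return np.concatenate([
        [len(points)],                          # point count
        extent, spread,
        [doppler.mean(), doppler.std(), np.ptp(doppler)],
        [rcs.mean(), rcs.std(), rcs.max()],
    ])

# Usage with synthetic clusters and five target classes:
X = np.stack([cluster_features(np.random.rand(20, 5)) for _ in range(100)])
y = np.random.randint(0, 5, size=100)
clf = RandomForestClassifier().fit(X, y)
```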
This paper presents a target tracking algorithm based on 4D millimeter-wave radar point clouds for autonomous driving applications. It addresses the limitations of traditional 2+1D radar systems by using higher-resolution point clouds, which enable more accurate motion-state estimation and richer target contour information. The algorithm proceeds in several steps. First, the ego vehicle's velocity is estimated from the radial velocities of the radar point cloud. Cluster proposals are then obtained with a density-based clustering method, and target association regions are derived from them. Binary Bayesian filtering then determines whether each target is dynamic or static based on its distribution characteristics. For dynamic targets, Kalman filtering estimates and updates the target state using trajectory and velocity information; for static targets, the rolling-ball method estimates and updates the shape contour boundary. Unassociated measurements are used to initialize contours and trajectories, and unassociated tracks are selectively retained or deleted. The effectiveness of the proposed method is verified on real data. Overall, the proposed algorithm has the potential to improve the accuracy and reliability of target tracking in autonomous driving, providing more comprehensive motion-state and contour information for better decision making.
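The abstract does not give the ego-velocity estimator, but a standard approach in the radar literature fits the sensor velocity to the radial velocities of static detections by least squares; the sketch below assumes mostly static detections (a robust variant such as RANSAC would be needed to reject moving targets):

```python
import numpy as np

def estimate_ego_velocity(azimuth, elevation, v_radial):
    """Least-squares ego-velocity estimate from one radar scan.

    For a static world point at azimuth az and elevation el, the
    measured radial velocity of a sensor moving at (vx, vy, vz) is
        v_r = -(vx*cos(el)*cos(az) + vy*cos(el)*sin(az) + vz*sin(el)),
    so the velocity follows from a linear fit over all detections.
    """
    A = -np.stack([np.cos(elevation) * np.cos(azimuth),
                   np.cos(elevation) * np.sin(azimuth),
                   np.sin(elevation)], axis=1)   # (n_detections, 3)
    v, *_ = np.linalg.lstsq(A, v_radial, rcond=None)
    return v  # estimated (vx, vy, vz) of the ego sensor

# Usage: 50 simulated static detections seen from a sensor moving
# at 10 m/s along x; the estimate recovers ~[10, 0, 0].
az = np.random.uniform(-1.0, 1.0, 50)
el = np.random.uniform(-0.2, 0.2, 50)
vr = -(10.0 * np.cos(el) * np.cos(az))
print(estimate_ego_velocity(az, el, vr))
```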