Radio-frequency-based non-cooperative monitoring of humans has numerous applications, ranging from law enforcement to ubiquitous sensing applications such as ambient assisted living and biomedical applications for non-intrusively monitoring patients. Large training datasets, almost unlimited memory capacity, and ever-increasing processing speeds of computers could drive forward data-driven, deep-learning-focused research in the above applications. However, generating and labeling large volumes of high-quality, diverse radar datasets is an onerous task. Furthermore, unlike the fields of vision and image processing, the radar community has limited access to databases that contain large volumes of experimental data. Therefore, in this article, we present an open-source, motion-capture-data-driven simulation tool, SimHumalator, that can generate large volumes of human micro-Doppler radar data in passive WiFi scenarios. The simulator integrates IEEE 802.11 WiFi standard (IEEE 802.11g, n, and ad) compliant transmissions with human animation data to generate micro-Doppler features that incorporate the diversity of human motion characteristics and the sensor parameters. The simulated signatures have been validated against experimental data gathered using an in-house-built hardware prototype. This article describes the simulation methodology in detail and provides case studies on the feasibility of using simulated micro-Doppler spectrograms for data augmentation tasks.
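To make the micro-Doppler concept concrete, the sketch below is a minimal, illustrative example (not taken from SimHumalator) of how a spectrogram of a single oscillating scatterer, such as a swinging limb illuminated at a WiFi carrier frequency, can be generated with a short-time Fourier transform. All parameter values (carrier, sampling rate, limb amplitude and frequency) are assumptions chosen for illustration.

```python
import numpy as np
from scipy.signal import stft

# Assumed parameters for illustration only -- not SimHumalator's values.
fc = 2.4e9               # IEEE 802.11g carrier frequency (Hz)
c = 3e8                  # speed of light (m/s)
fs = 1000                # slow-time sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)

# Hypothetical limb motion: sinusoidal radial displacement,
# 0.2 m amplitude at 1 Hz (e.g., an arm swing during walking).
r = 0.2 * np.sin(2 * np.pi * 1.0 * t)

# Two-way phase history of the returned signal; its time derivative
# gives the instantaneous Doppler shift f_D = (2 * fc / c) * dr/dt.
phase = 4 * np.pi * fc * r / c
sig = np.exp(1j * phase)

# Short-time Fourier transform yields the micro-Doppler spectrogram.
f, tt, Z = stft(sig, fs=fs, nperseg=128, return_onesided=False)
spectrogram = np.abs(Z) ** 2
```

With these numbers the peak radial velocity is about 1.26 m/s, so the Doppler content of `spectrogram` is confined to roughly ±20 Hz, tracing the familiar sinusoidal micro-Doppler signature over time.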
This paper presents a comprehensive dataset intended to evaluate passive Human Activity Recognition (HAR) and localization techniques with measurements obtained from synchronized Radio-Frequency (RF) devices and vision-based sensors. The dataset consists of RF data including Channel State Information (CSI) extracted from a WiFi Network Interface Card (NIC), Passive WiFi Radar (PWR) built upon a Software Defined Radio (SDR) platform, and Ultra-Wideband (UWB) signals acquired via commercial off-the-shelf hardware. It also includes vision/infrared-based data acquired from Kinect sensors. Approximately 8 hours of annotated measurements are provided, collected across two rooms from 6 participants performing 6 daily activities. This dataset can be exploited to advance WiFi- and vision-based HAR, for example, using pattern recognition, skeletal representation, deep learning algorithms, or other novel approaches to accurately recognize human activities. Furthermore, it can potentially be used to passively track a human in an indoor environment. Such datasets are key tools for the development of new algorithms and methods in the context of smart homes, elderly care, and surveillance applications.
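As a minimal illustration of how CSI measurements like these can feed a HAR pipeline, the sketch below computes a simple amplitude-variability "motion score" on synthetic CSI. The array layout (packets × subcarriers), the synthetic channels, and the `motion_score` function are all assumptions for illustration; real CSI formats vary by NIC and extraction tool.

```python
import numpy as np

# Assumed CSI layout: (n_packets, n_subcarriers) complex channel estimates.
rng = np.random.default_rng(0)
n_packets, n_subcarriers = 2000, 30

# Synthetic "idle" CSI: a static channel with small measurement noise.
idle = (1.0 + 0.01 * rng.standard_normal((n_packets, n_subcarriers))) \
    * np.exp(1j * 0.01 * rng.standard_normal((n_packets, n_subcarriers)))

# Synthetic "activity" CSI: a slowly rotating multipath component,
# mimicking the channel perturbation caused by a moving person.
t = np.arange(n_packets)[:, None]
activity = idle + 0.3 * np.exp(1j * 2 * np.pi * 0.01 * t)

def motion_score(csi):
    """Std of each subcarrier's amplitude over the capture, averaged.

    Human motion perturbs multipath, so amplitudes fluctuate more
    during activity than when the environment is static.
    """
    return float(np.abs(csi).std(axis=0).mean())
```

A threshold on such a score is a crude activity detector; the deep-learning approaches the dataset targets would instead learn features directly from the CSI (or PWR/UWB) streams.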