Accurate human identification using radar has a variety of potential applications, such as surveillance, access control, and security checkpoints. Nevertheless, radar-based human identification has been limited to a few motion-based biometrics that rely solely on micro-Doppler signatures. This paper proposes, for the first time, the combined use of radar-based heart sound and gait signals as biometrics for human identification. The proposed methodology first converts the biometric signatures collected from 18 subjects into images; an image augmentation technique is then applied, and deep transfer learning is used to classify each subject. Validation accuracies of 58.7% and 96% are reported for the heart sound and gait biometrics, respectively. Next, the identification results of the two biometrics are combined using the joint probability mass function (PMF) method, yielding a 98% identification accuracy, which, to the best of our knowledge, is the highest reported in the literature to date. Lastly, the trained networks are tested in a real-world scenario, an office access control platform used to identify different human subjects, where an accuracy of 76.25% is achieved.
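To make the fusion step concrete, the following is a minimal sketch of how a joint-PMF combination of two classifier outputs could be implemented, assuming each network produces a softmax probability vector over the same 18 subjects and the two biometrics are treated as independent. The function name and shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fuse_joint_pmf(p_heart: np.ndarray, p_gait: np.ndarray) -> int:
    """Combine two per-subject PMFs by elementwise product (joint PMF,
    assuming independence) and return the most probable subject index."""
    joint = p_heart * p_gait       # joint probability of each subject
    joint /= joint.sum()           # renormalise so the PMF sums to 1
    return int(np.argmax(joint))   # identified subject

# Example with 18 subjects: random placeholder softmax outputs standing in
# for the heart-sound and gait networks.
rng = np.random.default_rng(0)
p_heart = rng.dirichlet(np.ones(18))
p_gait = rng.dirichlet(np.ones(18))
print("Fused identity:", fuse_joint_pmf(p_heart, p_gait))
```

In this scheme a subject that scores moderately well under both biometrics can outrank one that scores highly under only one, which is consistent with the fused accuracy exceeding either single-biometric result.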
Human identification and activity recognition (HIAR) is crucial for many applications, such as surveillance, smart homes, and assisted living. As a sensing modality, radar has many unique characteristics, including privacy protection and contactless sensing. Single-task classification systems have been shown to be accurate, but long-term solutions will require both human identification (ID) and human activity recognition (HAR) to be integrated in one system where they can be utilised simultaneously. In this article, a novel radar-based human tracking system is presented in which three classifiers are utilised to identify the subject and his/her behaviour. The system tracks the subject and detects the type of his/her motion; based on the detected motion type, the three classifiers are utilised for identification and activity recognition. The classifiers are built using deep transfer learning, and three radar datasets are established to train and validate each of the deep networks. In recognising six activities and 10 human subjects, the three classifiers, namely HAR, Gait ID, and Heart sound ID, achieve superior performance compared to the best reported results in the literature, with classification accuracies of 97.6%, 100%, and 41.8%, respectively. Three successful examples are presented to demonstrate the introduced concept.
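The abstract implies a dispatch step in which the detected motion type selects which of the three classifiers to run. The sketch below illustrates one plausible routing scheme; the stub classifiers, motion labels, and selection rules are placeholders assumed for illustration, not the authors' design.

```python
from typing import Callable, Dict

def make_stub(name: str) -> Callable[[object], str]:
    """Stand-in for a trained deep network returning a class label."""
    return lambda radar_frame: f"{name}-prediction"

har_net = make_stub("HAR")              # activity recognition classifier
gait_id_net = make_stub("GaitID")       # gait-based identification
heart_id_net = make_stub("HeartSoundID")  # heart-sound-based identification

def classify(motion_type: str, radar_frame: object) -> Dict[str, str]:
    """Route a tracked subject's radar snapshot to the relevant classifiers."""
    results: Dict[str, str] = {}
    if motion_type == "walking":
        results["identity"] = gait_id_net(radar_frame)
        results["activity"] = har_net(radar_frame)
    elif motion_type == "stationary":
        results["identity"] = heart_id_net(radar_frame)
    else:
        results["activity"] = har_net(radar_frame)
    return results

print(classify("walking", radar_frame=None))
```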
This paper presents the design and development of miniature coils for wireless power and data transfer through metal. Our coil has a total size of 15 mm × 13 mm × 6 mm. Experimental results demonstrate that we can harvest 440 mW through a 1 mm-thick aluminum plate. Aluminum and stainless-steel barriers of different thicknesses were used to characterize coil performance. Using a pair of the designed coils, we have developed a through-metal communication system to successfully transfer data through a 1 mm-thick aluminum plate. A maximum data rate of 100 bps was achieved using only harvested power. To the best of our knowledge, this is the first report that demonstrates power and data transfer through aluminum using miniature coils.
Crystal x-ray imaging is frequently used in inertial confinement fusion and laser-plasma interaction applications, as it has advantages compared to pinhole imaging, such as higher signal throughput, better achievable spatial resolution, and chromatic selection. However, currently used x-ray detectors are only able to obtain a single time-resolved image per crystal. The dilation-aided single-line-of-sight x-ray camera described here, designed for the National Ignition Facility (NIF), combines two recent diagnostic developments: the pulse dilation principle used in the dilation x-ray imager (DIXI) and a ns-scale multi-frame camera that uses a hold-and-readout circuit for each pixel (hCMOS). This enables multiple images to be taken from a single line of sight with high spatial and temporal resolution. At present, the instrument can record two single-line-of-sight images with spatial and temporal resolutions of 35 µm and down to 35 ps, respectively, with a planned upgrade doubling the number of images to four. Here we present the dilation-aided single-line-of-sight camera for the NIF, including the x-ray characterization measurements obtained at the COMET laser and the results from the initial timing shot on the NIF.
Electron tubes continue to provide the highest speeds possible for recording the dynamics of hot high-energy-density plasmas. Standard streak camera drive electronics and CCD readout are not compatible with the radiation environment associated with high-yield DT fusion inertial confinement fusion experiments (>10¹³ 14 MeV DT neutrons, or >10⁹ n cm⁻² ns⁻¹). We describe a hardened x-ray streak camera developed for the National Ignition Facility and present preliminary results from the first experiment in which it has participated, recording the time-resolved bremsstrahlung spectrum from the core of an inertial confinement fusion implosion at more than 40× the operational neutron yield limit of the previous National Ignition Facility x-ray streak cameras.