<div class="section abstract"><div class="htmlview paragraph">Investigating human driver behavior enhances the acceptance of autonomous driving and increases road safety in heterogeneous environments shared by human-operated and autonomous vehicles. The previously established driver fingerprint model focuses on classifying driving styles based on CAN bus signals. However, driving styles are inherently complex and influenced by multiple factors, including changing driving environments and driver states. To create a comprehensive driver profile, an in-car measurement system based on the Driver-Driven vehicle-Driving environment (3D) framework is developed. The measurement system records emotional and physiological signals from the driver, including the ECG signal and heart rate. A Raspberry Pi camera mounted on the dashboard captures the driver's facial expressions, and a trained convolutional neural network (CNN) recognizes emotions. For unobtrusive ECG measurements, an ECG sensor is integrated into the steering wheel. Additionally, the system accesses CAN bus signals from the vehicle to assess the driver's driving style, extracting signals related to longitudinal and lateral control behavior from the Drive-CAN (A-CAN). Because variables in the driving environment, such as traffic signs and road conditions, can influence driving style, a windshield-mounted webcam is integrated into the measurement system. This setup enables real-time detection of common traffic signs and assessment of road conditions, distinguishing between dry, wet, and icy road surfaces. To augment the image data from the camera, signals from in-car ADAS sensors, such as the distance to neighboring vehicles measured by the front radar, are integrated for a comprehensive analysis of driving style.
The established measurement system is currently implemented in a test vehicle and poised to investigate the interplay between the 3D parameters, with a focus on the driving style of the human driver.</div></div>
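As a rough illustration of the CAN-based part of the pipeline, the sketch below decodes speed and steering-angle signals from raw CAN payloads and derives a simple longitudinal-jerk proxy, one common indicator of driving-style aggressiveness. The byte layout, scaling factors, and signal selection here are assumptions for illustration only; real arbitration IDs and scalings are vehicle-specific and would come from the OEM's DBC description of the Drive-CAN (A-CAN).

```python
import struct

# Hypothetical decoder for a Drive-CAN (A-CAN) frame carrying vehicle speed
# and steering angle. The scalings and byte layout below are ASSUMED for
# illustration; real values must be taken from the vehicle's DBC file.
SPEED_SCALE = 0.01   # km/h per bit (assumed)
STEER_SCALE = 0.1    # deg per bit (assumed)

def decode_frame(payload: bytes):
    """Decode a payload: bytes 0-1 unsigned speed, bytes 2-3 signed steering."""
    speed_raw, steer_raw = struct.unpack_from(">Hh", payload, 0)
    return speed_raw * SPEED_SCALE, steer_raw * STEER_SCALE

def longitudinal_jerk(speeds_kmh, dt=0.1):
    """Second difference of speed as a rough jerk proxy (m/s^3)."""
    v = [s / 3.6 for s in speeds_kmh]                       # km/h -> m/s
    acc = [(v[i + 1] - v[i]) / dt for i in range(len(v) - 1)]
    return [(acc[i + 1] - acc[i]) / dt for i in range(len(acc) - 1)]

# Example: three simulated frames from a steady, gentle acceleration.
frames = [struct.pack(">Hh", raw, 50) for raw in (5000, 5050, 5100)]
speeds = [decode_frame(f)[0] for f in frames]   # 50.0, 50.5, 51.0 km/h
jerks = longitudinal_jerk(speeds)               # constant acceleration -> jerk 0
```

In a live setup the frames would arrive from a CAN interface (e.g. via the python-can library) rather than being packed by hand; the decoding and feature-extraction steps stay the same.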