2023
DOI: 10.1109/tits.2022.3217342
Toward Extremely Lightweight Distracted Driver Recognition With Distillation-Based Neural Architecture Search and Knowledge Transfer

Abstract: The number of traffic accidents has been continuously increasing in recent years worldwide. Many accidents are caused by distracted drivers, who take their attention away from driving. Motivated by the success of Convolutional Neural Networks (CNNs) in computer vision, many researchers developed CNN-based algorithms to recognize distracted driving from a dashcam and warn the driver against unsafe behaviors. However, current models have too many parameters, which is infeasible for vehicle-mounted computing. Thi…


Cited by 23 publications (4 citation statements)
References 62 publications
“…Lightweight CNN has received increasing attention due to fewer weighting parameters and lower computational requirements. The key points to achieve CNN lightweight are to reduce the number of parameters and computational complexity of the model, and the common methods include using depthwise separable convolution instead of regular convolution [44], model pruning [45], and knowledge distillation [46,47]. Model pruning usually requires fine-tuning the model, which is relatively complex and sometimes even requires re-training the entire model.…”
Section: Lightweight CNN
confidence: 99%
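The excerpt above names depthwise separable convolution as a drop-in replacement for regular convolution. The parameter savings can be sketched with a quick count; the layer sizes below are illustrative assumptions, not values from the cited paper:

```python
def conv_params(c_in, c_out, k):
    """Weights in a standard k x k convolution (biases omitted)."""
    return k * k * c_in * c_out

def dw_separable_params(c_in, c_out, k):
    """Depthwise k x k convolution (one filter per input channel)
    followed by a 1 x 1 pointwise convolution."""
    return k * k * c_in + c_in * c_out

# Hypothetical layer: 128 input channels, 256 output channels, 3x3 kernel.
c_in, c_out, k = 128, 256, 3
regular = conv_params(c_in, c_out, k)            # 294912 weights
separable = dw_separable_params(c_in, c_out, k)  # 33920 weights
print(f"regular: {regular}, separable: {separable}, "
      f"saving: {regular / separable:.1f}x")
```

For typical channel widths the saving approaches a factor of roughly k², which is why this substitution dominates lightweight CNN designs such as MobileNet.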
“…In other studies, the number of model parameters has been reduced by optimizing the structure of deep learning models [26,27]. As knowledge distillation has become an important lightweighting tool for deep learning models, it has been applied to the field of driver distraction monitoring [28,29]. Liu, et al [29] reduced the number of parameters of a deep learning model to 0.42M based on the knowledge distillation method, which facilitated the in-vehicle application of the driving behavior detection model.…”
Section: B. Deep Learning Feature Extraction and Classification
confidence: 99%
“…As knowledge distillation has become an important lightweighting tool for deep learning models, it has been applied to the field of driver distraction monitoring [28,29]. Liu, et al [29] reduced the number of parameters of a deep learning model to 0.42M based on the knowledge distillation method, which facilitated the in-vehicle application of the driving behavior detection model. However, an important drawback of knowledge distillation is the requirement of a very large training set for the teacher model to extract useful knowledge to guide the student model.…”
Section: B. Deep Learning Feature Extraction and Classification
confidence: 99%
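The teacher-student distillation referenced above (e.g., the 0.42M-parameter student of Liu et al. [29]) rests on training the student against the teacher's temperature-softened outputs. A minimal sketch of that soft-target loss, assuming the standard Hinton-style formulation with hypothetical logits (not values from the cited work):

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """Cross-entropy H(p_teacher, p_student) between the two
    temperature-softened distributions; in practice this term is
    scaled by T^2 and mixed with the hard-label loss."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    return -sum(pt * math.log(ps) for pt, ps in zip(p_t, p_s))

teacher = [4.0, 1.0, 0.2]  # hypothetical teacher logits
student = [3.0, 1.5, 0.5]  # hypothetical student logits
print(f"soft-target loss: {distillation_loss(student, teacher):.4f}")
```

The loss is minimized when the student's softened distribution matches the teacher's, which is what lets a small student inherit the teacher's inter-class similarity structure without needing the teacher's capacity.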
“…al [48] proposed a fine-grained detection of driver distraction, incorporating multisource data, but their NAS-generated CNN has a large volume (2.7M Params) and does not adequately cater to the specific needs of DDD. Fourth, Liu et al [49] introduced a teacherstudent model with knowledge distillation for the same task. As far as we know, this work is the sole endeavor in using benchmark public datasets and NAS for DDD, making it the most relevant point of comparison for our research.…”
Section: NAS for Driver Distraction Detection
confidence: 99%