2022
DOI: 10.1007/978-3-031-20992-5_4
Helmet Detection System for Motorcycle Riders with Explainable Artificial Intelligence Using Convolutional Neural Network and Grad-CAM

Cited by 2 publications (1 citation statement). References 16 publications.
“…However, these models lack explainability [21,22,31,46]. Although many XDL methods have been proposed for natural image problems [47][48][49], relatively less attention has been paid to model explainability in the context of brain imaging applications [19,50]. Consequently, the lack of interpretability in these models has been a concern for radiologists and healthcare professionals who find the black-box nature of the models inadequate for their needs.…”
Section: Related Work
confidence: 99%