2020
DOI: 10.3390/s20236858
VODCA: Verification of Diagnosis Using CAM-Based Approach for Explainable Process Monitoring

Abstract: Process monitoring at industrial sites contributes to system stability by detecting and diagnosing unexpected changes in a system. Today, as the infrastructure of industrial sites is advancing because of the development of communication technology, vast amounts of data are generated, and the importance of a way to effectively monitor such data in order to diagnose a system is increasing daily. Because a method based on a deep neural network can effectively extract information from a large amount of data, metho…

Cited by 13 publications (2 citation statements)
References 32 publications
“…The explosive increase in the use of deep neural network (DNN)-based models for applications across domains has resulted in a very strong need to find ways to interpret the decisions made by these models (Gade et al 2020;Tang et al 2021;Yap et al 2021;Oviedo et al 2022;Oh and Jeong 2020). Interpretability is an important aspect of responsible and trustworthy AI, and model explanation methods (also known as attribution methods) are an important aspect of the community's efforts towards explaining and debugging real-world AI/ML systems.…”
Section: Introduction (confidence: 99%)
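The VODCA paper cited above builds on class activation mapping (CAM), one of the attribution methods this statement refers to. As background, here is a minimal NumPy sketch of the standard CAM computation — a weighted sum of the final convolutional feature maps using the target class's linear-layer weights. The shapes and variable names are illustrative assumptions, not taken from the paper itself:

```python
import numpy as np

def class_activation_map(feature_maps: np.ndarray,
                         class_weights: np.ndarray) -> np.ndarray:
    """Standard CAM: weight each feature map by the target class's
    final-layer weight, sum over channels, and normalize to [0, 1].

    feature_maps:  (C, H, W) activations from the last conv layer.
    class_weights: (C,) weights of the target class in the final linear layer.
    """
    # Weighted sum over the channel axis -> (H, W) heatmap
    cam = np.tensordot(class_weights, feature_maps, axes=([0], [0]))
    cam = np.maximum(cam, 0.0)          # keep only positive evidence
    if cam.max() > 0:
        cam = cam / cam.max()           # scale for visualization
    return cam

# Toy example: 3 feature maps over a 4x4 spatial grid
rng = np.random.default_rng(0)
maps = rng.random((3, 4, 4))
weights = np.array([0.5, -0.2, 0.8])
heatmap = class_activation_map(maps, weights)
```

The resulting heatmap highlights which spatial regions of the input most supported the predicted class, which is the property explanation methods of this family exploit for verification and debugging.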
“…It attempts to cover the issues related to key enabling technologies for smart manufacturing such as product quality inspection based on deep learning, remaining useful life prediction for predictive maintenance based on deep learning, Machine Vision Systems, intelligent recommender system, Intelligent Decision-Making of Scheduling for Dynamic Permutation Flowshop via Deep Reinforcement Learning, Real-Time and Explainable Process Monitoring, Intelligence-Driven Decision Support System. These contributions represent an advance in the state-of-the-art of key enabling technologies for smart manufacturing [ 11 , 12 , 13 , 14 , 15 , 16 , 17 , 18 , 19 , 20 , 21 , 22 , 23 , 24 , 25 , 26 , 27 , 28 , 29 ]. The richness and diverseness of the papers submitted to this Special Issue confirm the importance of applications of AI in Smart Manufacturing.…”
confidence: 99%