2021
DOI: 10.1016/j.neucom.2021.04.003
Building partially understandable convolutional neural networks by differentiating class-related neural nodes

Cited by 7 publications (2 citation statements) · References 12 publications
“…We need to observe the essence through the phenomenon, so as to explain the predicted results. As mentioned in the paper [48], the CNN model is like a black box. How to enhance the interpretability of the model is very necessary, especially in the field of rolling bearing fault diagnosis.…”
Section: Interpretability of Lightweight 1DCNN
Confidence: 99%
“…Dai et al built the business model from the perspective of business ecology. The elements of the model included target customers, business systems, strategic positioning, and partners [ 9 ]. The research on business model innovation is mainly on the innovation driving force, including the decision-making ability of the executives and the overall level of the enterprise technology.…”
Section: Review and Analysis of Related Research
Confidence: 99%