2020
DOI: 10.1007/s13243-020-00093-9

Achieving remanufacturing inspection using deep learning

Abstract: Deep learning has emerged as a state-of-the-art learning technique across a wide range of applications, including image recognition, object detection and localisation, natural language processing, prediction and forecasting systems. With significant applicability, deep learning could be used in new and broader areas of applications, including remanufacturing. Remanufacturing is a process of taking used products through disassembly, inspection, cleaning, reconditioning, reassembly and testing to ascertain that …

Cited by 112 publications (117 citation statements)
References 49 publications
“…The input training image data in the DC-GAN [ 23 ] model is normalized to the range [−1,1] before being fed to the discriminator. Hence, in the last Deconv layer of the generator, tanh activation [ 31 ] is used to constrain the generated data to the range [−1,1].…”
Section: Methods (mentioning)
confidence: 99%
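The [−1,1] convention described in the statement above can be illustrated with a minimal sketch (assuming 8-bit input images; the function names here are illustrative, not from the cited paper): real images are rescaled from [0, 255] to [−1, 1] before reaching the discriminator, and the generator's final tanh keeps generated samples in the same range.

```python
import numpy as np

def normalize_uint8(img):
    """Map 8-bit pixel values from [0, 255] to [-1, 1]."""
    return img.astype(np.float32) / 127.5 - 1.0

# Real images, rescaled to the discriminator's expected input range.
img = np.array([[0, 128, 255]], dtype=np.uint8)
x = normalize_uint8(img)            # values in [-1, 1]

# Generator output: a final tanh bounds arbitrary pre-activations to (-1, 1),
# matching the range of the normalized real data.
fake = np.tanh(np.random.randn(4))
```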
“…The activation function is attached to every neuron in the network and determines whether that neuron should be activated, based on whether the neuron’s input is relevant to the model’s prediction. In this study, we used two activation functions, namely the Sigmoid [26] and ReLU [23] activation functions. We used Sigmoid in the base CNN model and ReLU in the rest of the models.…”
Section: Methods (mentioning)
confidence: 99%
“…Sigmoid: The sigmoid activation function is sometimes referred to as the logistic or squashing function in the literature [26]. The sigmoid is a non-linear activation function used mostly in feedforward neural networks.…”
Section: Methods (mentioning)
confidence: 99%
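The two activation functions named in the statement above can be sketched directly: the logistic sigmoid squashes any real input into (0, 1), while ReLU passes positive inputs through unchanged and zeroes out negative ones.

```python
import numpy as np

def sigmoid(x):
    """Logistic (squashing) function: maps any real input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    """Rectified Linear Unit: max(0, x) elementwise."""
    return np.maximum(0.0, x)

x = np.array([-2.0, 0.0, 2.0])
s = sigmoid(x)   # values strictly between 0 and 1; sigmoid(0) == 0.5
r = relu(x)      # negative inputs clipped to 0, positives unchanged
```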
“…The hidden layers in the CNN model used a Rectified Linear Unit (ReLU) [44] activation function, whereas the sequence-based networks, LSTM, BiLSTM and GRU, used a Hyperbolic Tangent [45] activation function for the hidden layers. The final fully connected output layer of all the deep learning models used a Softmax activation function.…”
Section: Methods (mentioning)
confidence: 99%
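The softmax output layer mentioned in the statement above can be sketched as follows: it converts a vector of raw class scores (logits) into a probability distribution, which is why it is the standard choice for the final layer of a classifier.

```python
import numpy as np

def softmax(z):
    """Convert raw scores into a probability distribution over classes."""
    e = np.exp(z - np.max(z))  # subtract the max for numerical stability
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)  # non-negative entries that sum to 1
```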