2018
DOI: 10.1016/j.jocs.2018.07.003
Multiple sclerosis identification by convolutional neural network with dropout and parametric ReLU

Cited by 188 publications (93 citation statements)
References 38 publications
“…The values of the mask were then determined to minimize the loss through iterative learning. In this study, the rectified linear unit (ReLU) [43][44][45][46] is used as the activation function rather than the sigmoid, as shown in Figure 7. This is because a vanishing gradient (in which the gradient converges to zero) occurs if the sigmoid function is used [47][48][49].…”
Section: Detection of Target Region (mentioning)
Confidence: 99%
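The statement above gives the standard motivation for ReLU: the sigmoid's derivative is at most 0.25, so the gradient signal shrinks geometrically as it is backpropagated through many layers, while the ReLU derivative is exactly 1 for positive inputs. A minimal NumPy sketch of that argument (an illustration, not code from the cited paper; the depth and input value are arbitrary assumptions):

```python
import numpy as np

def sigmoid_grad(x):
    # derivative of the sigmoid: s(x) * (1 - s(x)), bounded above by 0.25
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)

def relu_grad(x):
    # derivative of ReLU: 1 for positive inputs, 0 otherwise
    return float(x > 0)

# Backpropagating through `depth` layers multiplies the gradient by
# one activation derivative per layer; compare the surviving signal.
depth = 20
x = 1.0  # an assumed positive pre-activation, repeated at every layer

sig_signal = np.prod([sigmoid_grad(x) for _ in range(depth)])
relu_signal = np.prod([relu_grad(x) for _ in range(depth)])

print(f"sigmoid gradient signal after {depth} layers: {sig_signal:.2e}")
print(f"relu    gradient signal after {depth} layers: {relu_signal:.2e}")
```

For positive activations the ReLU signal stays at 1.0 while the sigmoid signal collapses toward zero, which is the vanishing-gradient effect the quoted study avoids. (The paper under discussion actually uses parametric ReLU, which additionally gives a small learned slope for negative inputs.)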
“…Other questions already addressed in radiomics studies include the differentiation of MS from neuromyelitis optica spectrum disorders [28][29][30] and the distinction of MS patients from healthy controls. On the latter topic, studies based on deep learning also exist [31][32][33]. Eitel et al [34] additionally investigated which features the algorithm draws on for classification, and were able to show that, besides the typical lesions, normal-appearing areas also contribute to a lesser extent, such as…”
Section: Integration of Clinical Data (unclassified)
“…After training, the network fits the training set almost perfectly, but only the training set. Dropout is a technique that addresses this over-fitting [37,38]. As shown in Figure 4, some units are selected randomly and their incoming and outgoing connections are discarded from the network.…”
Section: Dropout Layer (mentioning)
Confidence: 99%
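The dropout mechanism described above can be sketched in a few lines of NumPy. This is the common "inverted dropout" formulation (an illustration under that assumption, not the cited paper's implementation): during training each unit is zeroed with probability `p_drop` and the survivors are rescaled so the expected activation is unchanged; at inference the layer is an identity.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p_drop, training=True):
    """Inverted dropout: randomly zero units during training and
    rescale the survivors by 1 / (1 - p_drop) so the expected
    activation matches inference, where the layer does nothing."""
    if not training:
        return activations
    mask = rng.random(activations.shape) >= p_drop
    return activations * mask / (1.0 - p_drop)

a = np.ones((4, 8))
out = dropout(a, p_drop=0.5)
# roughly half the entries are zeroed; surviving entries become 2.0
```

Dropping a unit removes both its incoming and outgoing contributions for that forward/backward pass, which matches the description of discarded connections in the quoted statement.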