2019
DOI: 10.1109/access.2019.2946264
MPCE: A Maximum Probability Based Cross Entropy Loss Function for Neural Network Classification

Cited by 69 publications (25 citation statements) · References 38 publications
“…The following equation expresses the softmax function: p_i = e^{z_i} / Σ_j e^{z_j} (38), where z_i represents the logit corresponding to phase i and p_i denotes the probability of that phase. To adapt the softmax classifier, the loss function uses the softmax cross-entropy loss: L = −Σ_i y_i log(p_i) (39), where y_i and p_i represent the label’s true value and the probability of the phase calculated by Eq. (1), respectively.…”
Section: Results
confidence: 99%
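The softmax and cross-entropy definitions quoted above can be sketched in a few lines of NumPy. This is a minimal illustration of the standard formulas, not code from the cited paper; the max-subtraction and `eps` clamp are common numerical-stability conventions assumed here.

```python
import numpy as np

def softmax(z):
    """Softmax over logits z: p_i = exp(z_i) / sum_j exp(z_j)."""
    z = z - np.max(z)            # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def cross_entropy(y, p, eps=1e-12):
    """Cross-entropy between one-hot labels y and probabilities p."""
    return -np.sum(y * np.log(p + eps))

logits = np.array([2.0, 1.0, 0.1])
p = softmax(logits)              # probabilities sum to 1
y = np.array([1.0, 0.0, 0.0])    # true class is index 0
loss = cross_entropy(y, p)
```

The loss shrinks toward zero as the probability assigned to the true class approaches one, which is why the quoted excerpt pairs softmax outputs with cross-entropy rather than, say, squared error.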
“…The L_CE term describes the distance between the probability distributions of the segmented venous thrombus and the ground truth. Employing L_CE enables clear segmentation of high-intensity, large-area thrombi [16]. L_SD is used here to address the data imbalance present in whole lower-extremity MR images, since some venous thrombi are much smaller than the background [17].…”
Section: Methods
confidence: 99%
“…The acceptance and rejection thresholds were 0.95 and 0.05, respectively, corresponding to standard classification at the 95% confidence level. Cross-entropy (CE) was applied as the error function (Equation (2)), because the CE implementation allows interpreting the output values as probabilities of object-group membership [80].…”
Section: Artificial Neural Network (ANNs)
confidence: 99%
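The thresholding scheme described above can be sketched as a per-class decision rule: a class output is accepted above 0.95, rejected below 0.05, and left undecided in between. This is an illustrative reading of the quoted setup, not the cited authors' code.

```python
import numpy as np

ACCEPT, REJECT = 0.95, 0.05   # thresholds from the quoted setup

def decide(p):
    """Map each class probability to accept / reject / undecided.

    Treating network outputs as membership probabilities (the reason CE
    was chosen as the error function) makes these cutoffs meaningful."""
    out = []
    for prob in p:
        if prob >= ACCEPT:
            out.append("accept")      # confident membership
        elif prob <= REJECT:
            out.append("reject")      # confident non-membership
        else:
            out.append("undecided")   # falls in the 5%-95% gray zone
    return out
```

For example, `decide(np.array([0.97, 0.02, 0.01]))` accepts the first class and rejects the other two, while a flatter output such as `[0.6, 0.3, 0.1]` is left entirely undecided.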