2020
DOI: 10.1016/j.neucom.2020.01.025
Unimodal regularized neuron stick-breaking for ordinal classification

Cited by 52 publications (37 citation statements)
References 22 publications
“…and $t$ is a one-hot distribution with $t_{j^*} = 1$, there is only one feasible optimal transport plan.…”
Section: Optimal Transport With One-hot Target
confidence: 99%
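The uniqueness claim in this excerpt can be checked directly: if the target marginal is one-hot at index $j^*$, every other column of the transport plan must sum to zero, and since plan entries are non-negative, the only feasible plan sends each source mass $s_i$ straight to column $j^*$. A minimal NumPy sketch (the names `s`, `C`, and `j_star` are illustrative, not taken from the paper):

```python
import numpy as np

N = 5
s = np.array([0.1, 0.2, 0.4, 0.2, 0.1])   # source distribution over N classes
j_star = 2                                 # index of the one-hot target class
C = np.abs(np.subtract.outer(np.arange(N), np.arange(N))).astype(float)  # |i - j| ground cost

# Marginal constraints: plan.sum(axis=1) == s and plan.sum(axis=0) == one_hot(j_star).
# Every column except j_star must sum to 0; with non-negative entries it is therefore
# identically zero, so plan[:, j_star] = s is the unique feasible transport plan.
plan = np.zeros((N, N))
plan[:, j_star] = s

# The OT cost then collapses to a weighted column of the cost matrix.
ot_cost = (plan * C).sum()   # equals s @ C[:, j_star]
```

Note how the OT cost, unlike plain cross-entropy, scales with the ground distance $|i - j^*|$, which is what makes it attractive for ordinal targets.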
“…Conventionally, risk minimization in deep learning is based on an N-way flat softmax prediction and the cross-entropy (CE) loss, where N is the number of categories. However, this ignores the correlation between classes and cannot discriminate between different kinds of misclassification [1,2].…”
Section: Introduction
confidence: 99%
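The limitation described here is easy to demonstrate: with a one-hot target, CE depends only on the probability assigned to the true class, so a near-miss onto an adjacent ordinal class and a gross error onto a distant class incur identical loss. A minimal sketch (the 5-class example and probability values are illustrative, not from the paper):

```python
import numpy as np

def cross_entropy(probs, label):
    # Standard CE with a one-hot target: only the true-class probability matters.
    return -np.log(probs[label])

label = 1  # true ordinal class (e.g. the second of five age buckets)

near_miss = np.array([0.4, 0.6, 0.0, 0.0, 0.0])  # leaked mass on the adjacent class 0
far_miss  = np.array([0.0, 0.6, 0.0, 0.0, 0.4])  # same leaked mass on the distant class 4

# Both predictions give -log(0.6): CE is blind to how far the wrong mass landed.
loss_near = cross_entropy(near_miss, label)
loss_far = cross_entropy(far_miss, label)
```

This class-distance blindness is exactly what ordinal-aware losses (such as the unimodal regularization discussed in the cited paper) are designed to address.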
“…Compared to the case of a single image [24,37,31,38,29,26], richer and complementary information can be expected from a set, because the samples are captured from multiple views [36]. However, sets also pose several challenges, including a) a variable number of samples per set, b) larger inner-set variability than in the video-based recognition counterpart, and c) order-less data.…”
Section: Introduction
confidence: 99%
“…On the other hand, metric regression methods treat pose as a continuous numerical value [35,28,32], although the pose labels themselves are discrete. As shown in [36,31,27], learning a regression model from discrete labels causes over-fitting and yields similar or inferior performance compared with classification.…”
Section: Introduction
confidence: 99%