2021
DOI: 10.48550/arxiv.2111.08851
Preprint

Deep Neural Networks for Rank-Consistent Ordinal Regression Based On Conditional Probabilities

Abstract: In recent times, deep neural networks have achieved outstanding predictive performance on various classification and pattern recognition tasks. However, many real-world prediction problems have ordinal response variables, and this ordering information is ignored by conventional classification losses such as the multi-category cross-entropy. Ordinal regression methods for deep neural networks address this. One such method is the CORAL method, which is based on an earlier binary label extension framework and achieves…
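The CORN method described in the abstract trains K-1 binary tasks on nested conditional subsets of the data, so that each task estimates a conditional probability P(y > k | y > k-1). A minimal NumPy sketch of that training loss, under the assumption of raw logits and integer labels (function name and shapes are illustrative, not the authors' reference implementation):

```python
import numpy as np

def corn_loss(logits, y, num_classes):
    """CORN loss sketch: K-1 binary tasks on nested conditional subsets.

    logits: (n, num_classes - 1) raw outputs; column k models
            P(y > k | y > k - 1).
    y:      (n,) integer labels in {0, ..., num_classes - 1}.
    """
    total, count = 0.0, 0
    for k in range(num_classes - 1):
        subset = y >= k                    # condition on y > k - 1
        if not subset.any():
            continue
        z = logits[subset, k]
        t = (y[subset] > k).astype(float)  # binary target for task k
        # numerically stable binary cross-entropy with logits
        total += np.sum(np.maximum(z, 0) - z * t + np.log1p(np.exp(-np.abs(z))))
        count += int(subset.sum())
    return total / count
```

With logits that separate the binary targets perfectly, the loss approaches zero, which is a quick sanity check on the conditional-subset construction.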

Cited by 6 publications (13 citation statements)
References 16 publications
“…Hyperparameters were optimized similarly to the neural network and yielded a tree depth of 1 (deeper trees generalized worse, likely due to overfitting). One reason for the better performance of the neural network might be the various techniques employed to deal with overfitting such as early stopping, 46 dropout, 47 and rank-consistent ordinal regression 25,26 (for details, see Section 4), which is a major problem in the presented setting due to the small number of positive examples.…”
Section: Exploration of Other Machine Learning Models
confidence: 99%
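The statement above credits early stopping as one of the anti-overfitting techniques. A minimal sketch of the usual patience-based rule, assuming a precomputed list of validation losses (helper name and patience value are hypothetical, chosen for illustration):

```python
import math

def early_stopping_epoch(val_losses, patience=2):
    """Return the epoch at which training would halt: the first epoch at
    which the validation loss has failed to improve for `patience`
    consecutive epochs, or the last epoch if that never happens."""
    best, wait = math.inf, 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, wait = loss, 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return len(val_losses) - 1
```

In practice one also restores the weights from the best epoch, which this sketch omits.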
“…The neural network was implemented in PyTorch 50 using the CORN loss for rank-consistent ordinal regression. 25,26 As an optimizer we chose AdamW 51 and trained the network using early stopping 46 and cyclical learning rates. 52 For the cyclical learning rate scheduling, we chose a base learning rate of 0.0001, a maximum learning rate of 0.01, the "triangular2" cycling mode, and the number of iterations of one cycle was set to four times the batch size.…”
Section: Neural Network Architecture and Training
confidence: 99%
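The schedule quoted above is the "triangular2" cyclical learning rate of Smith (2017): the rate ramps linearly between a base and a maximum, and the triangle's amplitude is halved after each full cycle. A stdlib-only sketch (the `step_size` default is illustrative; the quoted work sets one cycle to four times the batch size, so the actual value depends on the batch size):

```python
import math

def triangular2_lr(iteration, base_lr=1e-4, max_lr=1e-2, step_size=256):
    """'triangular2' cyclical learning rate: linear ramp between base_lr
    and max_lr, with the peak height halved every cycle of
    2 * step_size iterations."""
    cycle = math.floor(1 + iteration / (2 * step_size))
    x = abs(iteration / step_size - 2 * cycle + 1)
    return base_lr + (max_lr - base_lr) * max(0.0, 1 - x) / (2 ** (cycle - 1))
```

The first peak reaches `max_lr` exactly; the second peak reaches the midpoint between `base_lr` and `max_lr`. PyTorch ships this policy as `torch.optim.lr_scheduler.CyclicLR(mode="triangular2")`.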
“…They reported that the CORAL framework provided both rank consistency and superior results compared to previous approaches. Shi et al [24] proposed the Conditional Ordinal Regression for Neural Networks (CORN) framework to relax the constraint on the penultimate layer of the CORAL framework, increasing the neural network's capacity by introducing conditional probabilities. The authors reported that the CORN approach performs better than previous methods.…”
Section: Related Work
confidence: 99%
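The rank consistency mentioned in the statement above follows from how CORN's conditional probabilities are chained at inference time: multiplying them along the task axis yields unconditional probabilities P(y > k) that are non-increasing in k by construction, with no shared-bias constraint on the penultimate layer. A NumPy sketch of that decoding step (function name is illustrative):

```python
import numpy as np

def corn_predict(logits):
    """Turn CORN's (n, K-1) task logits into rank labels.

    sigmoid(logits[:, k]) estimates P(y > k | y > k - 1); the cumulative
    product along k gives the unconditional P(y > k), which is
    non-increasing in k by construction -- the rank-consistency property.
    The predicted rank counts how many of these exceed 0.5.
    """
    cond = 1.0 / (1.0 + np.exp(-logits))   # conditional probabilities
    uncond = np.cumprod(cond, axis=1)      # P(y > k), monotone non-increasing
    return (uncond > 0.5).sum(axis=1)
```

Because each factor lies in (0, 1), the cumulative product can never increase, so the thresholded indicators can never produce an inconsistent rank pattern such as (0, 1).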
“…In addition, we have experimented with the CORN framework, which is designed for ordinal regression tasks. The CORN framework has recently been introduced by Shi et al [24], and it is an improvement on previous ordinal regression approaches that transform the task into a series of binary classification sub-problems [14,4]. Experimental results show that the CORN approach and its predecessors (OR-NN [14] and CORAL [4]) have superior results compared to other ordinal regression approaches.…”
Section: Training Details
confidence: 99%