2019
DOI: 10.20944/preprints201904.0091.v1
Preprint

Preference Neural Network

Abstract: This paper proposes a preference neural network (PNN) to address the problem of indifference preference orders with a new activation function. PNN also solves the multi-label ranking problem, where labels may have indifference preference orders or where subgroups of labels are ranked equally. PNN follows a multi-layer feedforward architecture with fully connected neurons. Each neuron contains a novel smooth stairstep activation function based on the number of preference orders. PNN inputs represent data features and output neu…
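The abstract describes the activation only as a smooth stairstep whose number of plateaus matches the number of preference orders. A minimal sketch of one common way to build such a shape, as a sum of shifted sigmoids, is given below; the step count, shifts, and steepness constants here are illustrative assumptions, not the constants used in the paper.

```python
import numpy as np

def smooth_staircase(x, n_steps=3, steepness=25.0):
    """Illustrative smooth staircase: a sum of shifted sigmoids whose
    plateaus sit near the integer preference orders 0..n_steps.
    The published PNN activation may use different shifts/steepness."""
    x = np.asarray(x, dtype=float)
    return sum(1.0 / (1.0 + np.exp(-steepness * (x - i)))
               for i in range(1, n_steps + 1))

# Inputs well below 1 map near 0; inputs above n_steps saturate near n_steps.
xs = np.linspace(0.0, 4.0, 9)
print(np.round(smooth_staircase(xs, n_steps=3), 2))
```

Because the function is smooth everywhere, it remains differentiable for gradient-based training while still pushing activations toward discrete order values.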

Cited by 4 publications (4 citation statements) | References 28 publications
“…These fluctuations are not related to the gradient error in ranking; rather, they stem from averaging the ranking between two subgroups: as each subgroup tends to increase its ranking, it updates its weights, which is reflected in the shared weights and may slow the convergence of the other group. The fluctuation is shown in the linked video of the convergence of two groups on a toy dataset [ 43 ]. The convergence fluctuations are not observed when three subgroups are used together, i.e., the iris-wine-stock dataset, with the same hyper-parameters as the two-subgroup SGPNN.…”
Section: Discussion
confidence: 99%
“…The preference neural network ( PNN ) is a simple fully connected network with a single hidden layer, which provides desirable ranking performance owing to the SS activation function [ 39 ]. We performed experiments on 12 benchmark label ranking datasets [ 26 ], which show that increasing the number of hidden layers does not improve the performance but rather has adverse effects.…”
Section: The Proposed SGPNN
confidence: 99%
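The citation statement above characterizes PNN as a single-hidden-layer, fully connected ranker built on the SS activation. The sketch below shows a forward pass consistent with that description; the layer sizes, weight initialization, and the `smooth_staircase` helper (from the earlier sketch) are assumptions for illustration, not the published PNN/SGPNN hyper-parameters or training procedure.

```python
import numpy as np

def smooth_staircase(x, n_steps=3, steepness=25.0):
    # Same illustrative staircase as in the earlier sketch.
    x = np.asarray(x, dtype=float)
    return sum(1.0 / (1.0 + np.exp(-steepness * (x - i)))
               for i in range(1, n_steps + 1))

def pnn_forward(features, w_hidden, b_hidden, w_out, b_out, n_orders=3):
    """Forward pass of a single-hidden-layer, fully connected ranker.

    One output neuron per label; the staircase activation pushes each
    output toward a discrete preference order in {0, ..., n_orders}."""
    hidden = smooth_staircase(features @ w_hidden + b_hidden, n_orders)
    return smooth_staircase(hidden @ w_out + b_out, n_orders)

# Toy usage: 4 input features, 5 hidden units, 3 labels to rank.
rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4))
scores = pnn_forward(x,
                     rng.normal(size=(4, 5)), np.zeros(5),
                     rng.normal(size=(5, 3)), np.zeros(3))
ranking = np.argsort(-scores, axis=1)  # labels ordered by predicted preference
print(scores.round(2), ranking)
```

Labels whose outputs land on the same plateau receive the same order, which is how an architecture of this kind can express the indifference (equally ranked) subgroups discussed above.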