2021
DOI: 10.21203/rs.3.rs-631282/v1
Preprint
Softmax Function Based Neutrosophic Aggregation Operators and Application in Multi-Attribute Decision Making Problem

Abstract: The softmax function is a well-known generalization of the logistic function. It has been extensively used in various probabilistic classification methods such as softmax regression, linear discriminant analysis, naive Bayes classifiers, and artificial neural networks. Inspired by the advantages of the softmax function, the focus of this paper is to develop softmax function based single-valued neutrosophic aggregation operators. We then establish some essential properties of these aggregation operators based on …
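For context, the softmax function referenced in the abstract is standardly defined as follows (a sketch of the textbook definition; the paper's own notation is not visible in this excerpt):

$$\sigma(\mathbf{z})_i = \frac{e^{z_i}}{\sum_{j=1}^{K} e^{z_j}}, \qquad i = 1, \dots, K,$$

which reduces to the logistic function when K = 2 (applied to the difference of the two inputs), consistent with its description as a generalization of the logistic function.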

Cited by 2 publications (1 citation statement) | References 24 publications
“…The output layer, which uses a softmax classifier, is the last layer. The softmax function compresses a K-dimensional vector of arbitrary real values into a K-dimensional probability vector where each element's value is in the range of (0,1) and the element sum is "1" [83].…”
Section: Word2vec and Cosine Similarity
confidence: 99%
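The quoted statement describes softmax as compressing a K-dimensional vector of arbitrary reals into a K-dimensional probability vector with entries in (0, 1) summing to 1. A minimal Python sketch of that behaviour (NumPy is assumed here; neither the preprint nor the citing work provides code):

```python
import numpy as np

def softmax(z):
    """Map a K-dimensional real vector to a K-dimensional probability vector:
    each entry lies in (0, 1) and the entries sum to 1."""
    z = np.asarray(z, dtype=float)
    # Subtracting the maximum does not change the result but avoids overflow.
    exp_z = np.exp(z - z.max())
    return exp_z / exp_z.sum()

# Example with an arbitrary 4-dimensional score vector.
scores = np.array([2.0, -1.0, 0.5, 3.0])
probs = softmax(scores)
print(probs)        # every entry in (0, 1)
print(probs.sum())  # 1.0
```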