2021
DOI: 10.3390/e23121621

Real-World Data Difficulty Estimation with the Use of Entropy

Abstract: In the era of the Internet of Things and big data, we are faced with the management of a flood of information. The complexity and amount of data presented to the decision-maker are enormous, and existing methods often fail to derive nonredundant information quickly. Thus, the selection of the most satisfactory set of solutions is often a struggle. This article investigates the possibilities of using the entropy measure as an indicator of data difficulty. To do so, we focus on real-world data covering various f…
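The abstract is truncated here, so the paper's exact entropy-based measure is not reproduced. As a minimal illustration of the general idea only, the sketch below (an assumption, not the authors' method) computes the Shannon entropy of a dataset's class distribution, one common proxy for how mixed, and thus how difficult to separate, the labels are. The name `class_entropy` is illustrative.

```python
# Minimal sketch (not the paper's exact method): Shannon entropy of a
# dataset's class distribution as a rough data-difficulty indicator.
# A balanced (high-entropy) label distribution is generally harder to
# separate by chance than a heavily skewed one.
import math
from collections import Counter

def class_entropy(labels):
    """Shannon entropy (in bits) of the empirical class distribution."""
    counts = Counter(labels)
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Example: a balanced binary dataset vs. a heavily skewed one.
print(class_entropy([0, 1] * 50))         # ~1.00 bit (maximal for two classes)
print(class_entropy([0] * 95 + [1] * 5))  # ~0.29 bits
```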

Cited by 17 publications (6 citation statements)
References: 58 publications

“…Combating fake news needs to be two-fold. On the one hand, there are several technical solutions to detect fake news on the web, such as machine learning and artificial intelligence [48]. On the other hand, internet users need to be more aware of the existence of fake news and possess some basic knowledge about fake news recognition [49].…”
Section: Discussion (citation type: mentioning; confidence: 99%)

“…Shannon entropy has been proposed as a metric for use with respect to big data [30]; with the availability of and access to the appropriate data, this evaluation of diagnostic quality could be applied to entire hospitals or health networks, and not just with respect to one particular pathology but with regard to any diseased state (and potentially across disease states), as long as the core metrics of true and false positives and negatives are available for analysis, opening the door for the evaluation of diagnostic quality on a macro scale utilizing big data. Just as entropy removal is able to be used to evaluate groups in the aforementioned example, this concept could be used to assess entire departments and hospitals across networks as a metric of performance for healthcare innovation and quality.…”
Section: Discussion (citation type: mentioning; confidence: 99%)

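One way to make the "entropy removal" idea in the quoted passage concrete is to treat it as the mutual information between a diagnostic test result and the true disease state, computed from true/false positives and negatives. The sketch below is an assumption about that calculation, not the cited work's definition; `entropy_removed` and the example counts are illustrative.

```python
# Hedged sketch: quantifying the uncertainty a diagnostic test removes,
# given true/false positives and negatives. This computes the mutual
# information (in bits) between test result and disease state; the cited
# work may define its metric differently.
import math

def binary_entropy(p):
    """Shannon entropy (bits) of a Bernoulli(p) variable."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def entropy_removed(tp, fp, tn, fn):
    n = tp + fp + tn + fn
    prior = binary_entropy((tp + fn) / n)        # uncertainty before testing
    p_pos = (tp + fp) / n
    posterior = (p_pos * binary_entropy(tp / (tp + fp))
                 + (1 - p_pos) * binary_entropy(fn / (tn + fn)))
    return prior - posterior                     # bits of uncertainty removed

# Hypothetical confusion-matrix counts for a screening test.
print(entropy_removed(tp=80, fp=20, tn=880, fn=20))  # ~0.26 bits
```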
“…Z. Wang & Goh, 2022). As a loss function, cross-entropy is extensively employed in ML (Juszczuk et al., 2021). In classification, each example has a known class label with a probability of 1.0, whereas all other labels have a probability of 0.0 (Ho & Wookey, 2019). In this case, the model determines the probability that a given example corresponds to each class label (Singh, 2013). Cross-entropy can then be used to calculate the difference between the two probability distributions.…”
Section: Overfitting and Underfitting in ML Models (citation type: mentioning; confidence: 99%)
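The passage above describes categorical cross-entropy in the one-hot setting; a minimal sketch of that calculation follows. The helper `cross_entropy` and the example distributions are illustrative, not code from any cited work.

```python
# Illustrative sketch of the cross-entropy loss described above: the true
# label is one-hot (probability 1.0 for the correct class, 0.0 elsewhere),
# and the model outputs a probability per class.
import math

def cross_entropy(true_dist, pred_dist, eps=1e-12):
    """Cross-entropy H(p, q) = -sum_i p_i * log(q_i), in nats."""
    return -sum(p * math.log(q + eps) for p, q in zip(true_dist, pred_dist))

true_label = [0.0, 1.0, 0.0]      # one-hot: the example belongs to class 1
confident  = [0.05, 0.90, 0.05]   # model assigns 0.90 to the correct class
uncertain  = [0.40, 0.30, 0.30]   # model spreads its probability mass

print(cross_entropy(true_label, confident))  # ~0.105 (low loss)
print(cross_entropy(true_label, uncertain))  # ~1.204 (higher loss)
```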