2022
DOI: 10.1109/access.2022.3155231

Exponential Loss Minimization for Learning Weighted Naive Bayes Classifiers

Abstract: The naive Bayesian classification method has received significant attention in the field of supervised learning. It rests on the unrealistic assumption that all attributes are equally important. Attribute weighting is one of the methods used to relax this assumption and consequently improve the performance of naive Bayes classification. This study, with a focus on nonlinear optimization problems, proposes four attribute weighting methods by minimizing four different loss functions. The pr…
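In an attribute-weighted naive Bayes model, each attribute's log-likelihood contribution is scaled by a per-attribute weight before the class posterior is formed; setting every weight to 1 recovers the standard classifier. The Python sketch below illustrates that decision rule under Gaussian per-attribute likelihoods. It is a minimal illustration, not the paper's implementation: the function names, array shapes, and the Gaussian assumption are choices made here, and the weight vector w stands in for whatever the proposed loss-minimization procedures would produce.

```python
import numpy as np

def weighted_nb_log_posterior(x, log_priors, means, variances, w):
    """Unnormalized log posteriors of an attribute-weighted Gaussian naive Bayes model.

    x:          (D,)   attribute values of one sample
    log_priors: (C,)   log class priors log P(c)
    means:      (C, D) per-class attribute means
    variances:  (C, D) per-class attribute variances
    w:          (D,)   attribute weights; w = 1 gives standard naive Bayes
    """
    # Per-attribute Gaussian log-likelihoods log P(x_d | c), shape (C, D)
    log_lik = -0.5 * (np.log(2.0 * np.pi * variances) + (x - means) ** 2 / variances)
    # Each attribute's contribution is scaled by its weight w_d
    return log_priors + log_lik @ w  # shape (C,)

def predict(x, log_priors, means, variances, w):
    """Return the index of the class with the largest weighted log posterior."""
    return int(np.argmax(weighted_nb_log_posterior(x, log_priors, means, variances, w)))
```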


Cited by 24 publications (10 citation statements)
References 25 publications

“…Maximum likelihood estimation is used to compute σ_c^2 and μ_c. Despite its naive assumption of conditional independence, Naive Bayes classification has shown good performance in many complex real-world problems [21,22].…”
Section: Each Class's Answer Assignment in Naïve Bayes
Mentioning confidence: 99%
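As the quoted passage notes, μ_c and σ_c^2 in Gaussian naive Bayes are fitted by maximum likelihood, which reduces to the per-class sample mean and (biased) sample variance of each attribute. The sketch below shows that estimation step in the same hypothetical Python setup as above; the function name and array shapes are assumptions, not code from the cited papers.

```python
import numpy as np

def gaussian_nb_mle(X, y):
    """Maximum likelihood estimates of Gaussian naive Bayes parameters.

    X: (N, D) training attributes, y: (N,) integer class labels.
    Returns per-class means (C, D), variances (C, D), and log priors (C,).
    """
    classes = np.unique(y)
    means = np.stack([X[y == c].mean(axis=0) for c in classes])        # mu_c per attribute
    variances = np.stack([X[y == c].var(axis=0) for c in classes])     # sigma_c^2 (biased MLE)
    log_priors = np.log(np.array([(y == c).mean() for c in classes]))  # log P(c)
    return means, variances, log_priors
```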
“…To illustrate the efficiency of the proposed DFTF algorithm, we also adopt measurements similar to other frameworks, using the following criteria: Accuracy = (TP + TN) / (TP + FN + TN + FP) (21), Recall = TP / (TP + FN) (22), Precision = TP / (TP + FP) (23)…”
Section: Performance Evaluation Criteria
Mentioning confidence: 99%
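For completeness, the three criteria quoted above are plain ratios of confusion-matrix counts. The helper below simply restates equations (21)–(23) in code; it is not taken from the cited DFTF paper.

```python
def classification_metrics(tp, tn, fp, fn):
    """Accuracy, recall, and precision from binary confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fn + tn + fp)  # Eq. (21)
    recall = tp / (tp + fn)                     # Eq. (22)
    precision = tp / (tp + fp)                  # Eq. (23)
    return accuracy, recall, precision

# Example: 40 TP, 45 TN, 5 FP, 10 FN -> accuracy 0.85, recall 0.80, precision ~0.89
print(classification_metrics(40, 45, 5, 10))
```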
“…4) Naive Bayes (NB) algorithm [13]. It is one of the most widely used classification algorithms, with simple logic, easy implementation, and low time and space costs during the classification process.…”
Section: Comparison Algorithm
Mentioning confidence: 99%
“…But determining the parameters of NBWELM is a challenging task. Attribute weighting methods have been proposed that minimize different loss functions [21]. The loss function belongs to an exponential function family, which makes the optimization problem easier to solve, thereby enhancing the robustness of the naive Bayes classifier.…”
Section: Introduction
Mentioning confidence: 99%
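The statement above summarizes the paper's central idea: attribute weights are obtained by minimizing a loss drawn from an exponential function family, which keeps the resulting nonlinear optimization problem tractable. The sketch below is one hedged reading of that setup, not the authors' exact formulation: it defines an exponential loss on the log-posterior margin between the true class and the strongest competing class and minimizes it over the weights with a generic optimizer (scipy.optimize.minimize); the margin definition and the optimizer choice are assumptions made here.

```python
import numpy as np
from scipy.optimize import minimize

def exponential_loss(w, log_lik, log_priors, y):
    """Mean exponential loss of the class margin under attribute weights w.

    log_lik:    (N, C, D) per-sample, per-class, per-attribute log-likelihoods
    log_priors: (C,)      log class priors
    y:          (N,)      true class indices
    """
    scores = log_priors + log_lik @ w         # weighted log posteriors, shape (N, C)
    idx = np.arange(len(y))
    true_score = scores[idx, y]
    rivals = scores.copy()
    rivals[idx, y] = -np.inf                  # mask out the true class
    margin = true_score - rivals.max(axis=1)  # margin to the best rival class
    return np.exp(-margin).mean()             # exponential loss of the margin

def learn_weights(log_lik, log_priors, y):
    """Fit attribute weights by minimizing the exponential margin loss."""
    d = log_lik.shape[2]
    result = minimize(exponential_loss, x0=np.ones(d),
                      args=(log_lik, log_priors, y), method="Nelder-Mead")
    return result.x
```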