2019
DOI: 10.1016/j.nima.2019.162742
A supervised machine learning approach using naive Gaussian Bayes classification for shape-sensitive detector pulse discrimination in positron annihilation lifetime spectroscopy (PALS)

Cited by 18 publications (7 citation statements); References 26 publications.
“…Algorithm 1 GVFS method
Input: dataset with label
Output: feature-sorted dataset with label
1: … = normalize(…);
2: Divide the sample set into a training dataset and a test dataset;
3: Calculate the … for each feature on each category of the training dataset [33].
4: for i = 1:1:sample_number
5: Calculate the Gaussian probability for each feature on each category [33].
6: The category with the largest GPDF is taken as the category for each feature.…”
Section: Proposed Feature Selection Methods (mentioning, confidence: 99%)
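The quoted pseudocode omits the variable names (reproduced above as gaps). A minimal sketch of its core Gaussian step — classifying each sample, per feature, by the category whose Gaussian PDF fit on the training data is largest — might look like the following; every function and variable name here is an assumption, not the cited authors' code:

```python
import numpy as np

def per_feature_gaussian_labels(X_train, y_train, X_test):
    """For each feature independently, assign each test sample the
    category whose Gaussian PDF (mean/std fit on training data) is largest."""
    classes = np.unique(y_train)
    n_samples, n_features = X_test.shape
    # Per-class mean and std for every feature, fit on the training data
    means = np.array([X_train[y_train == c].mean(axis=0) for c in classes])
    stds = np.array([X_train[y_train == c].std(axis=0) + 1e-9 for c in classes])
    preds = np.empty((n_samples, n_features), dtype=classes.dtype)
    for j in range(n_features):
        # Gaussian PDF of feature j under each class: shape (n_classes, n_samples)
        z = (X_test[:, j][None, :] - means[:, j][:, None]) / stds[:, j][:, None]
        pdf = np.exp(-0.5 * z**2) / (stds[:, j][:, None] * np.sqrt(2 * np.pi))
        preds[:, j] = classes[np.argmax(pdf, axis=0)]  # largest GPDF wins
    return preds
```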
“…This study aims at constructing an effective and feasible fault diagnosis model for permanent magnet DC motors (PMDCMs) based on Gaussian naive Bayes (GNB) [33], the k-nearest neighbor algorithm (k-NN) [34], and the support vector machine (SVM) [35] by utilizing filter feature selection methods.…”
Section: Experiments Setting (mentioning, confidence: 99%)
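The combination this citing study describes — a filter feature-selection step feeding GNB, k-NN, and SVM classifiers — can be sketched with scikit-learn; the dataset here is a stand-in (the cited work uses motor fault data, not shown in the excerpt), and the specific filter (ANOVA F-score) and hyperparameters are assumptions:

```python
from sklearn.datasets import load_iris  # stand-in for the PMDCM fault data
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "GNB": GaussianNB(),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf"),
}
scores = {}
for name, clf in models.items():
    # Filter feature selection (ANOVA F-score) followed by the classifier
    pipe = make_pipeline(SelectKBest(f_classif, k=2), clf)
    pipe.fit(X_tr, y_tr)
    scores[name] = pipe.score(X_te, y_te)
print(scores)
```

A filter method like `SelectKBest` scores features independently of the downstream classifier, which is what lets the same selection step be shared across all three models.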
“…It is based on Bayes' theorem and assumes that the f features are mutually independent. In practice, the independence assumption may be violated, but it significantly simplifies the classification task without a dramatic reduction in prediction accuracy or robustness [16,32].…”
Section: Basics of Naïve Bayes (mentioning, confidence: 99%)
“…Chen et al [27] similarly applied the method, proposing two approaches for exploring the preferred prior GNB settings considering the individual impacts of the predictors. Equation 2 shows the concept of Naïve Bayes [28].…”
Section: (mentioning, confidence: 99%)
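The excerpt cites "equation 2" without reproducing it; the standard naive Bayes decision rule it presumably refers to has the form

$$\hat{y} = \arg\max_{c}\; P(c)\prod_{i=1}^{f} P(x_i \mid c),$$

where the product over the $f$ features follows directly from the mutual-independence assumption discussed above.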