2023
DOI: 10.1016/j.commatsci.2023.112321
A machine learning model for multi-class classification of quenched and partitioned steel microstructure type by the k-nearest neighbor algorithm

Cited by 10 publications (4 citation statements)
References 25 publications
“…So far, commonly used ML models and algorithms in HEA design include neural networks (NNs) [31][32][33][34][35][36][37][38][39][40][41][42][43], support vector machine (SVM) [44][45][46][47][48][49][50][51][52][53][54], Gaussian process (GP) [36,[55][56][57][58][59][60][61], k-nearest neighbors (KNN) [62][63][64][65][66], and random forests (RFs) models and algorithms [67,68] etc.…”
Section: Common ML Models and Algorithms in HEA Design
Confidence: 99%
“…A Decision Tree (DT) [ 50 ], for example, is a popular algorithm that constructs a hierarchical structure comprising decision nodes and leaf nodes, enabling efficient decision-making processes. SVC (C-Support Vector Classification) [ 51 ] takes a different approach by utilizing hyperplanes to classify data into distinct categories. Logistic Regression offers a probabilistic framework for predicting binary outcomes, while Random Forest [ 52 ] leverages an ensemble to improve predictive accuracy.…”
Section: Introduction
Confidence: 99%
“…Logistic Regression offers a probabilistic framework for predicting binary outcomes, while Random Forest [ 52 ] leverages an ensemble to improve predictive accuracy. KNN [ 51 ], on the other hand, makes use of proximity measurements to classify new instances, and Naïve Bayes applies Bayes’ theorem to calculate probabilities for classification purposes. Furthermore, the literature review highlights the rising prominence of XGBoost (eXtreme Gradient Boosting) [ 51 , 53 , 54 ] as a state-of-the-art static learning model.…”
Section: Introduction
Confidence: 99%
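The statement above describes KNN as classifying new instances by proximity, which is the approach the cited paper applies to multi-class microstructure labels. A minimal sketch of that idea, in pure Python with Euclidean distance and majority voting; the feature vectors and class names below are hypothetical stand-ins, not the descriptors used in the paper:

```python
from collections import Counter
import math

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among the k nearest training
    points (Euclidean distance), i.e. basic k-nearest-neighbor."""
    # Sort all training points by distance to the query.
    dists = sorted(
        (math.dist(x, query), label) for x, label in zip(train_X, train_y)
    )
    # Majority vote over the k closest labels.
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy 2-D feature vectors (hypothetical stand-ins for microstructure
# descriptors) with three classes, mirroring the multi-class setting.
X = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (1.0, 0.9), (0.5, 1.5), (0.6, 1.4)]
y = ["martensite", "martensite", "austenite", "austenite", "ferrite", "ferrite"]

print(knn_predict(X, y, (0.15, 0.15), k=3))  # -> martensite
```

The choice of k trades off noise sensitivity (small k) against blurring of class boundaries (large k); in practice it is tuned by cross-validation.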