2022
DOI: 10.1016/j.compchemeng.2022.107784

Predicting octane numbers relying on principal component analysis and artificial neural network

Cited by 13 publications (8 citation statements)
References 24 publications
“…When LPCA is used for dimensionality reduction, different strategies are available to choose a priori the optimal number of eigenvectors to be retained, i.e., to discard only the redundant information and/or the noise in the dataset. The aforementioned strategies include evaluating the explained variance in each cluster (i.e., the ratio between the cumulative sum of the eigenvalues and their sum), or the concurrent evaluation of the singular values and the dimensionality of the input matrix [42,43]. Nevertheless, when LPCA is employed for clustering, there are no available methods to choose this parameter a priori other than user expertise, and the same holds for the number of clusters, k. Moreover, as also shown in Fig.…”
Section: Clustering via Local Principal Component Analysis
confidence: 99%
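The explained-variance criterion mentioned in the statement above can be made concrete with a short sketch. The following is a minimal illustration, not taken from the cited paper: it retains the smallest number of eigenvectors whose cumulative eigenvalue sum reaches an assumed threshold (the threshold value and function name are illustrative).

```python
import numpy as np

def n_components_by_variance(X, threshold=0.99):
    """Smallest number of eigenvectors whose cumulative explained variance
    (cumulative eigenvalue sum over the total sum) reaches `threshold`."""
    Xc = X - X.mean(axis=0)                                       # center each variable
    eigvals = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))[::-1]  # eigenvalues, descending
    explained = np.cumsum(eigvals) / eigvals.sum()                # cumulative explained variance
    return int(np.searchsorted(explained, threshold) + 1)

X = np.random.rand(500, 10)                                       # toy dataset, 10 variables
print(n_components_by_variance(X, threshold=0.95))
```

In an LPCA setting this criterion would be evaluated separately in each cluster; as the statement notes, no analogous a priori rule exists for the number of clusters k.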
“…A principal component analysis (PCA) is an unsupervised learning method of feature extraction and dimensional reduction (moving p-dimensional data to a lower-dimensional m-dimensional linear subspace), retaining the original features of the data and selecting their key properties [42,[51][52][53]. It analyses a data table in which observations are described by several intercorrelated quantitative dependent variables and is widely used due to its ability to extract interpretable information by efficiently removing redundancies [54,55].…”
Section: Background, 2.1 Principal Component Analysis (PCA)
confidence: 99%
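The projection described in that statement, moving p-dimensional observations onto an m-dimensional linear subspace, can be sketched directly from the covariance eigendecomposition. This is a generic illustration of the standard PCA mapping, with illustrative variable names, not code from the cited works.

```python
import numpy as np

def pca_project(X, m):
    """Project n x p data onto the m leading eigenvectors of its covariance."""
    Xc = X - X.mean(axis=0)                 # center each variable
    cov = np.cov(Xc, rowvar=False)          # p x p covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # ascending eigenvalue order
    A = eigvecs[:, ::-1][:, :m]             # leading m eigenvectors (p x m basis)
    Z = Xc @ A                              # scores: n x m low-dimensional representation
    return Z, A

X = np.random.rand(200, 6)                  # 200 observations, p = 6
Z, A = pca_project(X, m=2)
print(Z.shape)                              # (200, 2)
```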
“…In literature, it is possible to find many applications exploiting PCA to speed-up computational fluid dynamics (CFD) simulations of reacting systems by reducing the number of the differential equations to be solved for combustion or atmospheric re-entry applications [7][8][9][10]. In addition, the algorithm and its local formulation were recently validated for clustering in chemical kinetics applications [11,12], for data analysis of Direct Numerical Simulation (DNS) of reacting jets [13,14], and for feature extraction and selection to enhance the classification and regression accuracy of Artificial Neural Networks (ANN) [15,16].…”
Section: Introduction
confidence: 99%
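The last statement points to PCA used as a feature-extraction front end for ANN regression, which is also the theme of the cited article's title. As a hedged sketch only, assuming a scikit-learn style pipeline with synthetic placeholder data and hyperparameters (none of these choices come from the paper), such a PCA + ANN regressor might look like this:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.random((300, 12))                                   # synthetic stand-in for input features
y = X @ rng.random(12) + 0.1 * rng.standard_normal(300)     # synthetic regression target

model = make_pipeline(
    StandardScaler(),                                       # scale variables before PCA
    PCA(n_components=5),                                    # retain 5 principal components (assumed)
    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0),
)
model.fit(X, y)
print(model.score(X, y))                                    # R^2 on the training data
```

The design choice illustrated is the one the citing papers describe: removing redundant, correlated inputs with PCA before the network, rather than feeding the full variable set to the ANN.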