2023
DOI: 10.3390/en16052419

PMV Dimension Reduction Utilizing Feature Selection Method: Comparison Study on Machine Learning Models

Abstract: Since P.O. Fanger proposed PMV, it has been the most widely used index for estimating thermal comfort. However, it is often challenging to measure, within indoor spaces, all six parameters essential for PMV estimation, and a few of them, such as Clo and Met, tend to show large deviations in accuracy. For these reasons, several studies have suggested methods to estimate PMV, but their accuracy was significantly compromised. In this vein, this study proposed a way to reduce the dimensions o…
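The feature-selection idea named in the title can be illustrated with a minimal sketch: rank the six PMV inputs (air temperature, mean radiant temperature, air velocity, humidity, Clo, Met) by a model-based importance score and keep only the stronger ones. This is a hypothetical illustration with placeholder data and a random-forest importance criterion, not the authors' actual pipeline or results.

# Hypothetical sketch, not the paper's exact method: rank the six PMV input
# parameters by random-forest feature importance and keep only those whose
# importance is at least the mean. All data below are random placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

FEATURES = ["air_temp", "radiant_temp", "air_velocity", "humidity", "clo", "met"]

rng = np.random.default_rng(0)
X = rng.normal(size=(500, len(FEATURES)))   # placeholder sensor readings
y = rng.normal(size=500)                    # placeholder PMV values

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
importances = model.feature_importances_

# Keep features whose importance reaches the mean importance.
kept = [name for name, imp in zip(FEATURES, importances) if imp >= importances.mean()]
print("selected features:", kept)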

Cited by 2 publications (1 citation statement)
References 26 publications
“…Subsequently, the normalized data were downscaled using principal component analysis. The downscaled data simplified the computation and visualization to a certain extent, and the noise and redundant information in the original data could be removed [41,42]. A line graph of the cumulative variance and the number of principal components was plotted (Figure 5), and the number of principal components corresponding to the cumulative contribution rate of 0.95 was used as the optimal number of principal components.…”
Section: Feature Selection (mentioning)
confidence: 99%
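The component-selection rule described in the citing passage (the smallest number of principal components whose cumulative explained-variance ratio reaches 0.95) can be sketched as follows. The data are random placeholders and the code is only an assumed reconstruction of that step, not the citing paper's implementation.

# Minimal sketch of the step quoted above: normalize the data, fit PCA, and
# pick the smallest number of components whose cumulative explained-variance
# ratio reaches 0.95. The feature matrix here is a random placeholder.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 12))              # placeholder feature matrix

X_norm = StandardScaler().fit_transform(X)  # normalization step
pca = PCA().fit(X_norm)

cumulative = np.cumsum(pca.explained_variance_ratio_)
n_components = int(np.searchsorted(cumulative, 0.95) + 1)
print("components needed for 95% cumulative variance:", n_components)

X_reduced = PCA(n_components=n_components).fit_transform(X_norm)
print("reduced shape:", X_reduced.shape)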