2022
DOI: 10.3390/w14172592
Water Quality Prediction Using KNN Imputer and Multilayer Perceptron

Abstract: The rapid development to accommodate population growth has a detrimental effect on water quality, which is deteriorating. Consequently, water quality prediction has emerged as a topic of great interest during the past decade. Existing water quality prediction approaches lack the desired accuracy. Moreover, the available datasets have missing values, which reduces the performance efficiency of classifiers. This study presents an automatic water quality prediction method that resolves the issue of missing values…

Cited by 82 publications (26 citation statements)
References 32 publications
“…The imputation technique you choose is important because it can have a big influence on how well your model performs. A popular approach for imputing missing data is the KNN imputer, which can be used instead of more conventional imputation techniques [59]. It substitutes missing values by using a Euclidean distance matrix to locate the nearest neighbors.…”
Section: Methodsmentioning
confidence: 99%
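As a minimal sketch of the technique this statement describes (using scikit-learn's `KNNImputer` on toy data, not necessarily the exact configuration of the cited work):

```python
import numpy as np
from sklearn.impute import KNNImputer

# Toy water-quality-style readings with missing values (np.nan)
X = np.array([
    [7.1, 250.0, np.nan],
    [6.9, 240.0, 4.8],
    [7.0, np.nan, 5.0],
    [7.3, 260.0, 5.2],
])

# Each missing entry is replaced with the mean of that feature
# over the k nearest neighbors, found via a NaN-aware Euclidean
# distance over the observed features.
imputer = KNNImputer(n_neighbors=2)
X_imputed = imputer.fit_transform(X)

print(X_imputed)  # same shape as X, with no NaNs remaining
```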
“…Given the gradient of the loss function, the weights can be updated as follows: w_new = w_old − η∇L, where w_new and w_old are the new and current weights, η is the learning rate, and ∇L is the gradient of the loss function calculated by an optimization algorithm, such as Adam, Stochastic Gradient Descent (SGD), RMSprop, Adagrad, Momentum, or Nesterov Accelerated Gradient [34, 35].…”
Section: Methodsmentioning
confidence: 99%
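The update rule quoted above, sketched as plain gradient descent; the quadratic loss here is a hypothetical choice for illustration only, not from the cited paper:

```python
# Plain gradient-descent update: w_new = w_old - lr * grad(w_old).
# Illustrative loss L(w) = (w - 3)^2, whose gradient is 2 * (w - 3),
# so the minimum sits at w = 3.

def grad(w):
    return 2.0 * (w - 3.0)

w = 0.0    # current weight w_old
lr = 0.1   # learning rate (eta)

for _ in range(100):
    w = w - lr * grad(w)  # the update rule from the text

print(round(w, 4))  # converges toward the minimum at w = 3
```

Adam, RMSprop, and the other optimizers named in the quote refine how the gradient term is scaled per step, but all share this same subtract-the-scaled-gradient core.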
“…k-NN is the first choice for medical data mining. k-NN is a straightforward instance-based classifier [34, 35]. A supervised learning model called k-NN compares new data to existing cases to determine how similar they are, then groups the new data with those cases that have the highest similarity.…”
Section: Materials and Methodsmentioning
confidence: 99%
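A minimal sketch of that similarity-based grouping, using scikit-learn's `KNeighborsClassifier` on hypothetical toy data (not from the cited study):

```python
from sklearn.neighbors import KNeighborsClassifier

# Toy 2-D feature vectors belonging to two classes
X_train = [[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]]
y_train = [0, 0, 1, 1]

# A new point is classified by majority vote among its k=3 most
# similar (closest by Euclidean distance) existing cases.
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X_train, y_train)

print(knn.predict([[0.95, 1.0]])[0])  # grouped with class 1
```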