2023
DOI: 10.3390/su16010019

Long-Term Forecasting of Air Pollution Particulate Matter (PM2.5) and Analysis of Influencing Factors

Yuyi Zhang,
Qiushi Sun,
Jing Liu
et al.

Abstract: Long-term forecasting and analysis of PM2.5, a significant air pollution source, is vital for environmental governance and sustainable development. We evaluated 10 machine learning and deep learning models using PM2.5 concentration data along with environmental variables. Employing explainable AI (XAI) technology facilitated explainability and formed the basis for factor analysis. At a 30-day forecasting horizon, ensemble learning surpassed deep learning in performance, with CatBoost emerging as the top-perfor…
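The abstract describes forecasting PM2.5 at a 30-day horizon. One common way to frame such a task for the ensemble and deep learning models mentioned is to slice the concentration series into lookback windows paired with targets 30 steps ahead. The sketch below illustrates that framing only; the window length, target construction, and synthetic data are assumptions, not details from the paper.

```python
import numpy as np

def make_supervised(series, lookback, horizon):
    """Slice a 1-D series into (window, target) pairs where the target
    lies `horizon` steps past the end of each lookback window."""
    X, y = [], []
    for start in range(len(series) - lookback - horizon + 1):
        X.append(series[start:start + lookback])
        y.append(series[start + lookback + horizon - 1])
    return np.array(X), np.array(y)

# Toy example: 100 days of synthetic PM2.5-like readings,
# a 14-day lookback window, and a 30-day-ahead target.
pm25 = np.sin(np.linspace(0, 8, 100)) * 20 + 50
X, y = make_supervised(pm25, lookback=14, horizon=30)
print(X.shape, y.shape)  # (57, 14) (57,)
```

Tabular models such as CatBoost would consume each row of `X` (optionally joined with the environmental variables the abstract mentions) as features; sequence models would consume the windows directly.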

Cited by 8 publications (2 citation statements) | References 42 publications
“…From Figure 8, an increase in the predictive value of Fmax is associated with an increase in C4 and C1, which is consistent with previous studies. The high Fmax values of C3 and C4 positively correlate with SCOD, especially the high Fmax values of component C4, which shows long right tails in the summary plots [78]. However, the four models present different influence trends between C2 and Fmax.…”
Section: Sensitivity Analysis of Fluorescent Components
confidence: 92%
“…RNNs add a memory boost to neural networks, making them way smarter when dealing with sequences and patterns. In addition, to avoid the vanishing gradient predicament inherent in ordinary RNNs, long short-term memory (LSTM) [32,33] and gated recurrent unit (GRU) [34] were invented consecutively and became the most popular RNN architectures. RNNs are intrinsically effective for time-series forecasting tasks, including natural language processing (NLP), recommendation system, and stock prediction.…”
Section: Introduction
confidence: 99%
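The citation statement above credits LSTM and GRU with avoiding the vanishing-gradient problem of plain RNNs. The mechanism is the update gate: the new hidden state is a convex interpolation between the old state and a candidate, which gives gradients a near-additive path through time. The minimal GRU step below (numpy, biases omitted, random toy weights — all assumptions for illustration, not code from the paper) shows that gating.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, W_z, W_r, W_h):
    """One simplified GRU step (no bias terms). The update gate z
    interpolates between the previous state h and the candidate
    h_tilde, so information can pass through many steps unchanged."""
    hx = np.concatenate([h, x])
    z = sigmoid(W_z @ hx)                                # update gate
    r = sigmoid(W_r @ hx)                                # reset gate
    h_tilde = np.tanh(W_h @ np.concatenate([r * h, x]))  # candidate state
    return (1 - z) * h + z * h_tilde

rng = np.random.default_rng(0)
hidden, inp = 4, 3
W_z = rng.standard_normal((hidden, hidden + inp)) * 0.1
W_r = rng.standard_normal((hidden, hidden + inp)) * 0.1
W_h = rng.standard_normal((hidden, hidden + inp)) * 0.1

h = np.zeros(hidden)
for x in rng.standard_normal((10, inp)):  # run a 10-step toy sequence
    h = gru_cell(x, h, W_z, W_r, W_h)
print(h.shape)  # (4,)
```

Because each step blends the old state with a bounded candidate, the state stays in (-1, 1) and repeated multiplication by small Jacobians — the cause of vanishing gradients in plain RNNs — is no longer the only path backward through time.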