2022
DOI: 10.3390/atmos13060989

Multitask Learning Based on Improved Uncertainty Weighted Loss for Multi-Parameter Meteorological Data Prediction

Abstract: With the exponential growth in the amount of available data, traditional meteorological data processing algorithms have become overwhelmed. The application of artificial intelligence in simultaneous prediction of multi-parameter meteorological data has attracted much attention. However, existing single-task network models are generally limited by the data correlation dependence problem. In this paper, we use a priori knowledge for network design and propose a multitask model based on an asymmetric sharing mech…
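The truncated abstract names an improved uncertainty weighted loss but does not reproduce it. As background only, the sketch below shows the standard homoscedastic uncertainty weighting for multitask losses (Kendall et al., 2018) that such improvements typically build on; the class, the parameter names, and the two example tasks are illustrative assumptions, not the paper's actual formulation.

```python
# Background sketch only: standard uncertainty-weighted multitask loss,
# not the improved variant proposed in the paper.
import torch
import torch.nn as nn


class UncertaintyWeightedLoss(nn.Module):
    """Combine per-task losses using learnable log-variance weights."""

    def __init__(self, num_tasks: int):
        super().__init__()
        # s_i = log(sigma_i^2), initialised to 0 (i.e. sigma_i = 1)
        self.log_vars = nn.Parameter(torch.zeros(num_tasks))

    def forward(self, task_losses):
        total = torch.zeros((), device=self.log_vars.device)
        for i, loss in enumerate(task_losses):
            # exp(-s_i) down-weights noisier tasks; the +s_i term keeps the
            # learned variances from growing without bound
            total = total + torch.exp(-self.log_vars[i]) * loss + self.log_vars[i]
        return total


# Illustrative usage with two regression tasks (e.g. temperature and pressure):
criterion = nn.MSELoss()
weighting = UncertaintyWeightedLoss(num_tasks=2)
pred_t, true_t = torch.randn(8, 1, requires_grad=True), torch.randn(8, 1)
pred_p, true_p = torch.randn(8, 1, requires_grad=True), torch.randn(8, 1)
total_loss = weighting([criterion(pred_t, true_t), criterion(pred_p, true_p)])
total_loss.backward()  # gradients reach both the predictions and the log-variances
```

In a full model, the log-variance parameters would be optimised jointly with the network weights, so the relative weighting of the meteorological parameters adapts during training.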

Cited by 1 publication (3 citation statements)
References 21 publications
“…The particular Jena dataset points used in the abovementioned Figure 8 of Wang et al. (2022) were determined using a simple algorithm. They turned out to be points in a 24-hour segment starting from the 50,095th hour of the Jena dataset.…”
Section: Discussion
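As an aside, the quote above describes only where the points lie, not the selection algorithm itself. A minimal sketch of pulling out that 24-hour window is given below, assuming the publicly available Jena Climate CSV resampled to hourly means; the file name, column handling, and 1-based indexing of the 50,095th hour are assumptions, and the citing paper's own algorithm is not reproduced here.

```python
# Hedged sketch: extract a 24-hour window starting at the 50,095th hour of the
# Jena Climate dataset, assuming the 10-minute records are resampled to hourly
# means. File name, column name, and 1-based indexing are assumptions.
import pandas as pd

df = pd.read_csv("jena_climate_2009_2016.csv")
df["Date Time"] = pd.to_datetime(df["Date Time"], format="%d.%m.%Y %H:%M:%S")
hourly = df.set_index("Date Time").resample("1h").mean()

start = 50_095                                      # hour index quoted above (1-based)
segment = hourly.iloc[start - 1 : start - 1 + 24]   # 24 consecutive hourly points
print(segment.index[0], segment.index[-1])
```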
“…Comparison of MIRNet's single-step prediction on the Jena dataset with previous studies on the same dataset reveals a more objective view of the performance of the model. At first glance, MIRNet outperforms all but one (Wang et al., 2022) existing single-step model. A closer look at the Wang et al. model, however, raises a number of questions.…”