2022
DOI: 10.1109/access.2022.3170905
A New Multipredictor Ensemble Decision Framework Based on Deep Reinforcement Learning for Regional GDP Prediction

Abstract: Gross domestic product (GDP) can effectively reflect the situation of economic development and resource allocation in different regions. High-precision GDP prediction lays a foundation for the sustainable development of regional resources and the formulation of economic management policies. To build an accurate GDP prediction model, this paper proposes a new multi-predictor ensemble decision framework based on deep reinforcement learning. Overall modeling consists of the following steps: Firstly, GR…

Cited by 12 publications
(6 citation statements)
References 67 publications
“…In this method, predictions from GRU, temporal convolutional network, and deep belief network models were taken as input to train three GDP prediction models, and the deep Q-network algorithm was used to optimize the integration weight coefficients. The proposed method outperformed 18 competing methods in evaluation experiments, achieving MAPE values below 4.2% in all tests [15].…”
Section: Introduction
confidence: 82%
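The statement above describes the core mechanism: base predictors produce forecasts, and a reinforcement-learning agent optimizes the integration weight coefficients. A minimal sketch of that idea follows, with synthetic data and placeholder predictors standing in for the GRU/TCN/DBN models. The paper uses a deep Q-network; this simplified single-state, tabular variant only illustrates treating weight selection as an RL action, and all names and values here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Sketch: choose ensemble weights for three base predictors by Q-learning.
# The reward is the negative mean absolute error of the weighted combination.
rng = np.random.default_rng(0)

# Synthetic target series and three noisy stand-in base-model predictions.
y = np.linspace(100.0, 200.0, 50)
preds = np.stack([y + rng.normal(0, s, y.size) for s in (2.0, 5.0, 8.0)])

# Actions: candidate weight vectors on a coarse grid over the 3-simplex.
actions = [np.array([a, b, 10 - a - b]) / 10.0
           for a in range(11) for b in range(11 - a)]

q = np.zeros(len(actions))   # one Q-value per action (single state)
alpha, eps = 1.0, 0.2        # rewards are deterministic, so a full step suffices

for episode in range(2000):
    a = rng.integers(len(actions)) if rng.random() < eps else int(np.argmax(q))
    w = actions[a]
    ensemble = w @ preds                       # weighted combination
    reward = -np.mean(np.abs(ensemble - y))    # negative MAE as reward
    q[a] += alpha * (reward - q[a])

best_w = actions[int(np.argmax(q))]
mape = np.mean(np.abs(best_w @ preds - y) / y) * 100
print(best_w, round(mape, 2))
```

As expected for this setup, the learned weights concentrate on the less noisy predictors, and the ensemble MAPE ends up well below that of the worst single model.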
“…ERL can be divided into parallel ERL and sequential ERL according to the relationship between base learners in ERL. Figure 6 and Figure 7 give schematic diagrams of these two frameworks. Base learners combined in the surveyed studies include:

- (reference truncated): long short-term memory network, gated recurrent unit network
- Goyal et al. [46]: convolution neural network, gated recursive unit
- Liu et al. [54]: long short-term memory network, deep belief network, echo state network
- Perepu et al. [55]: linear regression model, long short-term memory model, artificial neural network, random forest
- Liu et al. [56]: graph convolutional network, long short-term memory network, gated recursive unit
- Saadallah et al. [50]: autoregressive integrated moving average, exponential smoothing, gradient boosting machines, Gaussian processes, support vector regression, random forest, projection pursuit regression, MARS, principal component regression, decision tree regression, partial least squares regression, multilayer perceptron, long short-term memory network (LSTM), bidirectional LSTM (Bi-LSTM), CNN-based LSTM, convolutional LSTM
- Daniel L. Elliott and Charles Anderson [57]: convolution neural network, gated recursive unit, artificial neural network
- Shang et al. [30]: gated recursive unit, graph convolutional network, graph attention network
- Tan et al. [31]: graph attention network, long short-term memory network, temporal convolutional network
- Li et al. [58]: gated recurrent unit, deep belief network, temporal convolutional network
- Zijie Cao and Hui Liu [59]: temporal convolutional network, bidirectional long short-term memory network, kernel extreme learning machine
- Birman et al. [60]: machine learning models, artificial neural network
- Li et al. [51]: naive Bayes, support vector machine with stochastic gradient descent, FastText, bidirectional long short-term memory
- Sharma et al. [52]: support vector regressor (SVR), eXtreme gradient boosting (XGBoost), random forest (RF), artificial neural network (ANN), long short-term memory (LSTM), convolution neural network (CNN), CNN-LSTM, CNN-XGB, CNN-SVR, CNN-RF
- Shi Yin and Hui Liu [61]: group method of data handling, echo state network, extreme learning machine
- Yu et al. [29]: graph attention network, gated recursive unit, temporal convolutional network

There are also studies that try to construct the ERL method in the sequential framework [64,65].…”
Section: Combination of Models
confidence: 99%
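The parallel/sequential distinction drawn above can be sketched in a few lines: in a parallel ensemble the base learners predict independently and their outputs are combined, while in a sequential ensemble each learner consumes the previous learner's output. The toy "models" below are placeholders, not the cited implementations.

```python
from typing import Callable, Sequence

Model = Callable[[float], float]

def parallel_ensemble(models: Sequence[Model], x: float,
                      weights: Sequence[float]) -> float:
    """Weighted sum of independent base-learner predictions."""
    return sum(w * m(x) for w, m in zip(weights, models))

def sequential_ensemble(models: Sequence[Model], x: float) -> float:
    """Each learner refines the previous learner's output."""
    out = x
    for m in models:
        out = m(out)
    return out

# Toy base learners.
double = lambda v: 2 * v
inc = lambda v: v + 1

print(parallel_ensemble([double, inc], 3.0, [0.5, 0.5]))  # 0.5*6 + 0.5*4 = 5.0
print(sequential_ensemble([double, inc], 3.0))            # inc(double(3)) = 7.0
```

The design difference is exactly the data flow: the parallel form needs a combination rule (here fixed weights, in the surveyed methods often learned), while the sequential form needs an ordering of the learners.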
“…This system architecture allows the ensemble models in ERL to handle the same or different tasks. Representative applications include:

- [95]: fuel economy improvement, Q-learning, 2021
- Carta et al. [49]: stock market forecasting, deep Q-network, 2022
- Li et al. [58]: regional GDP prediction, deep Q-network, 2022
- Zijie Cao and Hui Liu [59]: carbon price forecasting, Q-learning, 2022
- Németh, Marcell and Szűcs, Gábor [70]: algorithmic trading, proximal policy optimization, advantage actor-critic, deep deterministic policy gradient…”
Section: Internet of Things and Cloud Computing Area
confidence: 99%
“…AI models can learn from past data to predict future trends and changes in regional economies. For instance, deep learning-based sequence-to-sequence models have been used for regional economic prediction [30], and reinforcement learning frameworks have been employed for regional GDP prediction [10]. These predictive models assist in forecasting economic conditions, informing policy decisions, and promoting proactive economic planning and management.…”
Section: Big Data Processing and Analysis
confidence: 99%
“…In recent years, several studies have explored the use of AI methods for regional economic analysis. For example, econometric and machine learning methods have been used to understand the impact of higher education systems on regional economic development [8], deep learning models have been developed to assess regional economic growth factors [9], and reinforcement learning frameworks have been proposed for regional GDP prediction [10], while a multi-graph convolutional network was used for regional economy prediction [11]. Additionally, the role of emerging technologies, such as 5G and the Internet of Things (IoT), in regional economic development has been investigated in the context of AI [12,13].…”
Section: Introduction
confidence: 99%