2022
DOI: 10.1371/journal.pone.0262501

Stock prediction based on bidirectional gated recurrent unit with convolutional neural network and feature selection

Abstract: In recent years, the field of deep learning has made great progress. Compared with traditional machine learning algorithms, deep learning can better uncover patterns in data and achieve a better fit. In this paper, we propose a hybrid stock forecasting model based on Feature Selection, Convolutional Neural Network and Bidirectional Gated Recurrent Unit (FS-CNN-BGRU). Feature Selection (FS) selects the data that performs better with respect to the results as the input data after dat…
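The abstract describes a pipeline of feature selection followed by a convolutional layer and a bidirectional GRU. Below is a minimal PyTorch sketch of such a CNN-BiGRU stack; the layer sizes, kernel width, and the `selected_features` indices are illustrative assumptions, not the paper's reported configuration.

```python
import torch
import torch.nn as nn

class CNNBiGRU(nn.Module):
    """Sketch of a CNN + bidirectional GRU regressor.
    Hyperparameters below are illustrative, not the paper's settings."""
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        # 1-D convolution over the time axis extracts local patterns
        self.conv = nn.Conv1d(n_features, 32, kernel_size=3, padding=1)
        self.bigru = nn.GRU(32, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)        # forward + backward states

    def forward(self, x):                           # x: (batch, time, features)
        x = self.conv(x.transpose(1, 2))            # -> (batch, 32, time)
        out, _ = self.bigru(x.transpose(1, 2))      # -> (batch, time, 2*hidden)
        return self.head(out[:, -1])                # predict from last time step

# Feature selection is approximated here by indexing assumed columns.
selected_features = [0, 1, 2, 3]                    # hypothetical FS result
x = torch.randn(8, 30, 10)[:, :, selected_features]
model = CNNBiGRU(n_features=len(selected_features))
print(model(x).shape)                               # torch.Size([8, 1])
```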

Cited by 23 publications (20 citation statements) · References 37 publications
“…A more comprehensive and complex set of text features is obtained for each node by concatenating the results of the forward and backward layers. This process enables a context-based comprehensive judgment [45][46][47].…”
Section: B. Analysis of the Bi-GRU Model
confidence: 99%
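The concatenation of forward and backward hidden states described in the statement above can be illustrated with a short PyTorch sketch; the batch, sequence, and feature dimensions are assumed for illustration.

```python
import torch
import torch.nn as nn

# Bidirectional GRU: each position gets a forward and a backward hidden state
bigru = nn.GRU(input_size=16, hidden_size=32, batch_first=True, bidirectional=True)

x = torch.randn(4, 10, 16)          # (batch, sequence length, feature dim)
out, _ = bigru(x)                   # (4, 10, 64): forward and backward states concatenated per step

# The first 32 channels come from the forward pass, the last 32 from the
# backward pass; together they give each position a context built from
# both directions of the sequence.
forward_part, backward_part = out[..., :32], out[..., 32:]
print(forward_part.shape, backward_part.shape)      # torch.Size([4, 10, 32]) each
```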
“…Finally, the BiGRU model is exploited for the HGR procedure. An RNN is effectively utilized for handling data series from distinct regions [20]. In an RNN, assume that the input series x = x_1, ….…”
Section: Module II: Gesture Recognition
confidence: 99%
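As a concrete illustration of processing an input series x = x_1, …, x_T with a recurrent unit, here is a minimal PyTorch sketch using a GRU cell; the feature and hidden sizes are arbitrary assumptions.

```python
import torch
import torch.nn as nn

# Recurrent processing of an input series x = x_1, ..., x_T:
# the hidden state is updated one element at a time.
cell = nn.GRUCell(input_size=8, hidden_size=16)   # sizes are assumptions

x = torch.randn(20, 1, 8)            # T = 20 steps, batch of 1, 8 features
h = torch.zeros(1, 16)               # initial hidden state h_0
for x_t in x:                        # h_t = GRU(x_t, h_{t-1})
    h = cell(x_t, h)
print(h.shape)                       # torch.Size([1, 16]) -- final summary state
```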
“…The concept of a GNN first appeared in 2005; since then, deep learning has boomed and GNN research has become quite active. In particular, major methods have emerged since 2016, such as the Gated GNN and the Graph Convolutional Network (GCN), an architecture that can learn embedded representations of nodes both with and without supervision based on the node features [84][85][86][87] (Figure 3). When obtaining an embedding representation for node A on a graph with a two-layer GCN, the GCN computes the node's embedding by repeatedly aggregating its neighboring nodes.…”
Section: Otherwise
confidence: 99%
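A minimal sketch of the two-layer neighbour aggregation described above, using plain NumPy; the toy graph, feature dimensions, and random weights are made up for illustration.

```python
import numpy as np

# Toy undirected graph: node A (index 0) with two neighbours.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
A_hat = A + np.eye(3)                      # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt   # symmetric normalisation

X = np.random.randn(3, 4)                  # node features (assumed dims)
W1 = np.random.randn(4, 8)                 # layer-1 weights (random here)
W2 = np.random.randn(8, 2)                 # layer-2 weights

# Two rounds of "aggregate neighbours, then transform":
H1 = np.maximum(A_norm @ X @ W1, 0)        # ReLU(A_norm X W1)
H2 = A_norm @ H1 @ W2                      # embeddings for all nodes
print(H2[0])                               # embedding of node A
```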
“…Since the recursive process in Step 1 is repeated, the amount of calculation is large. A gated graph sequence neural network (GGS-NN) replaces the recursion in Step 1 with a Gated Recurrent Unit (GRU), the gating mechanism of a recurrent neural network (RNN); it performs better on certain smaller datasets and removes the constraint of contraction mapping [86][87][88][89][90][91][92]. The GRU concept can be expressed using the following formula:…”
Section: Current QSAR
confidence: 99%
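The quoted passage breaks off before the formula itself; a standard formulation of the GRU update (whose notation may differ from the citing paper's) is:

```latex
\begin{aligned}
z_t &= \sigma\left(W_z x_t + U_z h_{t-1} + b_z\right) \\
r_t &= \sigma\left(W_r x_t + U_r h_{t-1} + b_r\right) \\
\tilde{h}_t &= \tanh\left(W_h x_t + U_h\,(r_t \odot h_{t-1}) + b_h\right) \\
h_t &= (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t
\end{aligned}
```

where z_t is the update gate, r_t the reset gate, and ⊙ denotes element-wise multiplication.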