2022
DOI: 10.1155/2022/4725639
Evolving Long Short-Term Memory Network-Based Text Classification

Abstract: Recently, long short-term memory (LSTM) networks have been extensively utilized for text classification. Compared to feed-forward neural networks, they have feedback connections and can therefore learn long-term dependencies. However, LSTM networks suffer from a parameter tuning problem: the initial and control parameters of an LSTM are generally selected on a trial-and-error basis. Therefore, in this paper, an evolving LSTM (ELSTM) network is proposed. A multiobjective genetic algorithm (MOGA) is used t…
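The abstract is truncated above, so the paper's actual MOGA encoding and objectives are not available here. As a rough, hypothetical illustration of the kind of evolutionary hyperparameter search it describes, a minimal single-objective sketch over two LSTM hyperparameters (the search space and fitness function below are illustrative stand-ins, not the paper's method):

```python
import random

# Hypothetical search space -- NOT the paper's encoding, just a sketch.
HIDDEN_UNITS = [32, 64, 128, 256]
LEARNING_RATES = [1e-1, 1e-2, 1e-3, 1e-4]

def toy_fitness(individual):
    """Stand-in for the validation score of an LSTM trained with these
    hyperparameters; a real implementation would train and evaluate a model."""
    units, lr = individual
    return -abs(units - 128) / 128 - abs(lr - 1e-3)

def evolve(generations=20, pop_size=10, seed=0):
    rng = random.Random(seed)
    pop = [(rng.choice(HIDDEN_UNITS), rng.choice(LEARNING_RATES))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=toy_fitness, reverse=True)
        parents = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = (a[0], b[1])                 # one-point crossover
            if rng.random() < 0.2:               # mutation
                child = (rng.choice(HIDDEN_UNITS), child[1])
            children.append(child)
        pop = parents + children
    return max(pop, key=toy_fitness)
```

A multiobjective version such as the paper's would return a Pareto front (e.g. accuracy versus model size) rather than a single best individual.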

Cited by 14 publications (9 citation statements). References 38 publications.
“…In addition, there is a problem of gradient disappearance during backpropagation, mainly because the gradient used to update the neural network weights becomes very small, making it impossible to learn longer text-sequence information. To eliminate this issue, the long short-term memory (LSTM) [27] and GRU [28] text classification methods are used. The LSTM approach has a large number of parameters and is computationally expensive.…”
Section: Shallow Bidirectional GRU Network Design
confidence: 99%
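The claim that an LSTM carries more parameters than a GRU follows directly from the standard gate equations: an LSTM layer has four weight sets (input, forget, output, and candidate gates) versus the GRU's three. A quick check of the counts:

```python
def lstm_params(input_dim, hidden_units):
    """Standard LSTM: 4 gates (input, forget, output, cell candidate),
    each with input weights, recurrent weights, and a bias."""
    return 4 * (input_dim * hidden_units
                + hidden_units * hidden_units
                + hidden_units)

def gru_params(input_dim, hidden_units):
    """Standard GRU: 3 gates (update, reset, candidate)."""
    return 3 * (input_dim * hidden_units
                + hidden_units * hidden_units
                + hidden_units)

print(lstm_params(100, 128))  # 117248
print(gru_params(100, 128))   # 87936
```

For any input and hidden size the LSTM costs exactly 4/3 the parameters of the corresponding standard GRU layer.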
“…In sequence prediction problems, classifiers take an input instance and assign it to one of the available classes. Long short-term memory (LSTM) is a type of recurrent neural network (RNN) that has found extensive use in applications such as speech recognition [ 31 ], text classification [ 32 ], and ECG biometrics [ 33 ]. In this field, a hybrid model called CNN-LSTM is utilized.…”
Section: Related Work
confidence: 99%
“…The hidden layer contains one or more neurons capable of storing long-term memory, and each neuron controls the input, output, or forgetting of the constant error carousel (CEC) through a gate function. Long short-term memory (LSTM) networks have been widely used to solve text classification problems (Singh et al, 2022).…”
Section: Neural Network
confidence: 99%
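The gate mechanism described above — input, forget, and output gates guarding the constant error carousel — can be written as a single cell-update step. A minimal NumPy sketch with randomly initialized weights (illustrative only; the gate stacking order is an assumption of this sketch):

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM cell step. W: (4h, d) input weights, U: (4h, h) recurrent
    weights, b: (4h,) biases, stacked in order [input, forget, cell, output]."""
    h = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = 1 / (1 + np.exp(-z[:h]))         # input gate
    f = 1 / (1 + np.exp(-z[h:2*h]))      # forget gate: guards the CEC
    g = np.tanh(z[2*h:3*h])              # candidate cell state
    o = 1 / (1 + np.exp(-z[3*h:]))       # output gate
    c = f * c_prev + i * g               # constant error carousel update
    return o * np.tanh(c), c

rng = np.random.default_rng(0)
d, h = 5, 4
W = rng.normal(size=(4 * h, d))
U = rng.normal(size=(4 * h, h))
b = np.zeros(4 * h)
h_t, c_t = lstm_step(rng.normal(size=d), np.zeros(h), np.zeros(h), W, U, b)
```

Because the cell state is updated additively (`f * c_prev + i * g`), gradients can flow through many time steps without vanishing, which is the property the citing passage refers to.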