2020
DOI: 10.1016/j.procs.2020.02.234

Application of Long-Short Memory Neural Networks in Semantic Search Engines Development

Cited by 3 publications (1 citation statement)
References 5 publications
“…In ANN, it is assumed that the output depends only on the current input, which is not true in the real world (Gu et al, 2019). LSTM allows us to infer the potential relationships among the content of the context because it is a recurrent neural network (RNN): the output depends on the current input and memory (Klimov et al, 2020; Huang et al, 2021; Li et al, 2022; Gorgij et al, 2023). The basic idea of RNN is to build a hidden state that captures the information from the previous time point, so the global parameters are calculated from the current time and all previous memories.…”
Section: The Learning Processes
confidence: 99%
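The recurrence the citing authors describe can be sketched in a few lines. This is an illustrative toy (not code from the cited paper): the weight matrices, dimensions, and sequence are all hypothetical, chosen only to show that the hidden state `h` carries memory of every earlier input, so the final output depends on the whole history rather than on the current input alone.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, for illustration only.
input_dim, hidden_dim = 4, 3
W_xh = 0.1 * rng.normal(size=(hidden_dim, input_dim))   # input-to-hidden weights
W_hh = 0.1 * rng.normal(size=(hidden_dim, hidden_dim))  # hidden-to-hidden ("memory") weights
b_h = np.zeros(hidden_dim)

def rnn_step(h_prev, x_t):
    """One recurrent step: the new state mixes the current input with prior memory."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

def run(sequence):
    h = np.zeros(hidden_dim)  # initial hidden state: no memory yet
    for x_t in sequence:
        h = rnn_step(h, x_t)  # h accumulates information from all earlier steps
    return h

seq = rng.normal(size=(5, input_dim))
h_final = run(seq)
h_reordered = run(seq[::-1])  # same inputs, different order => different memory
```

Because `h_final` and `h_reordered` differ even though the last input alone would be the same bag of values, the example makes concrete the quote's point that an RNN's output reflects accumulated memory, not just the current input; an LSTM adds gating on top of this same recurrence to preserve that memory over longer spans.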