2016
DOI: 10.1109/taslp.2016.2520371

Deep Sentence Embedding Using Long Short-Term Memory Networks: Analysis and Application to Information Retrieval

Abstract: This paper develops a model that addresses sentence embedding, a hot topic in current natural language processing research, using recurrent neural networks (RNN) with Long Short-Term Memory (LSTM) cells. The proposed LSTM-RNN model sequentially takes each word in a sentence, extracts its information, and embeds it into a semantic vector. Due to its ability to capture long-term dependencies, the LSTM-RNN accumulates increasingly richer information as it goes through the sentence, and when it reaches the last…
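
The embedding scheme described in the abstract can be pictured with a minimal PyTorch sketch: words are fed to an LSTM one at a time, and the final hidden state serves as the sentence's semantic vector. The vocabulary size, dimensions, and toy sentence below are illustrative assumptions, not the paper's actual configuration.

```python
# Minimal sketch of the embedding scheme described in the abstract: each word is
# fed to an LSTM in turn, and the final hidden state is taken as the sentence's
# semantic vector. Vocabulary size, dimensions, and the toy sentence are assumed
# for illustration; they are not the paper's actual configuration.
import torch
import torch.nn as nn

vocab_size, embed_dim, hidden_dim = 10_000, 300, 128   # assumed sizes
word_embed = nn.Embedding(vocab_size, embed_dim)
lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)

word_ids = torch.tensor([[12, 57, 3, 904]])             # one toy sentence as word indices
outputs, (h_n, c_n) = lstm(word_embed(word_ids))        # h_n holds the last hidden state
sentence_vector = h_n[-1]                               # shape: (1, hidden_dim)
```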

Cited by 709 publications (407 citation statements)
References 34 publications
“…To solve this problem, Hochreiter first proposed Long Short-Term Memory (LSTM) to learn representations of data with long-range dependencies [18]. Currently, as a popular neural network model, the LSTM-RNN has been verified to be an effective or even state-of-the-art method for many NLP tasks [19][20][21][22][23].…”
Section: Baseline Methods
confidence: 99%
“…Hints are processed independently and, as a result, contextual information is lost. Hamid Palangi, et al. [4]: the automatic keyword detection and topic allocation capabilities enabled by the LSTM-RNN permit the network to perform document retrieval, a difficult language processing task, in which the similarity between the query and documents can be measured by the distance between their corresponding sentence embedding vectors computed by the LSTM-RNN. Hao Wang, et al. [5]: at a higher level, probabilistic graphical models, with their Bayesian sequential nature, are nevertheless more flexible.…”
Section: Literature Survey
confidence: 99%
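
The retrieval step that statement describes, ranking documents by the similarity between their embedding vectors and the query's, can be sketched as below; the vectors here are random stand-ins rather than real LSTM-RNN sentence embeddings, and cosine similarity is used as the distance proxy.

```python
# Hypothetical illustration of the retrieval step described above: documents are
# ranked by cosine similarity between their embedding vectors and the query's.
# The vectors here are random stand-ins, not real LSTM-RNN sentence embeddings.
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
query_vec = rng.normal(size=128)                           # stand-in query embedding
doc_vecs = {f"doc{i}": rng.normal(size=128) for i in range(3)}

ranking = sorted(doc_vecs, key=lambda d: cosine(query_vec, doc_vecs[d]), reverse=True)
print(ranking)   # document ids ordered from most to least similar to the query
```
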
“…In 1997, Hochreiter and Schmidhuber proposed Long Short-Term Memory (LSTM). LSTM redesigned the RNN memory module; the LSTM unit consists of a memory cell and a number of gates (Palangi et al., 2015). LSTM uses the state of the memory cell to hold the history information.…”
Section: Construction of Base Classifier Using LSTM
confidence: 99%
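
For reference, one step of a standard LSTM unit of the kind that quote describes (input, forget, and output gates plus a memory cell that carries history) can be sketched as follows; the weight shapes and toy sizes are illustrative assumptions only.

```python
# Sketch of one step of a standard LSTM unit: input (i), forget (f), and output (o)
# gates plus a candidate memory (g); the memory cell c carries history information.
# Weight shapes and toy sizes are assumed for illustration only.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    i = sigmoid(W["i"] @ x + U["i"] @ h_prev + b["i"])    # input gate
    f = sigmoid(W["f"] @ x + U["f"] @ h_prev + b["f"])    # forget gate
    o = sigmoid(W["o"] @ x + U["o"] @ h_prev + b["o"])    # output gate
    g = np.tanh(W["g"] @ x + U["g"] @ h_prev + b["g"])    # candidate memory
    c = f * c_prev + i * g                                 # memory cell keeps history
    h = o * np.tanh(c)                                     # hidden state / output
    return h, c

# Toy usage with random parameters.
rng = np.random.default_rng(0)
d_in, d_h = 4, 3
W = {k: rng.normal(size=(d_h, d_in)) for k in "ifog"}
U = {k: rng.normal(size=(d_h, d_h)) for k in "ifog"}
b = {k: np.zeros(d_h) for k in "ifog"}
h, c = lstm_step(rng.normal(size=d_in), np.zeros(d_h), np.zeros(d_h), W, U, b)
```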