2017
DOI: 10.1007/978-3-319-73165-0_18

Bidirectional LSTM Recurrent Neural Network for Keyphrase Extraction

Abstract: To achieve state-of-the-art performance, keyphrase extraction systems rely on domain-specific knowledge and sophisticated features. In this paper, we propose a neural network architecture based on a Bidirectional Long Short-Term Memory (Bi-LSTM) Recurrent Neural Network that is able to detect the main topics of the input documents without the need to define new hand-crafted features. A preliminary experimental evaluation on the well-known INSPEC dataset confirms the effectiveness of the proposed solution.
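Read as a sequence-labeling task, the architecture the abstract outlines can be sketched in a few lines of Keras. Everything below (vocabulary size, embedding and hidden dimensions, a BIO tag set) is an illustrative assumption, not the configuration reported in the paper:

```python
# Minimal sketch of a Bi-LSTM sequence tagger for keyphrase extraction.
# All sizes and the tag set are assumptions for illustration only.
from tensorflow.keras.layers import (Input, Embedding, Bidirectional,
                                     LSTM, TimeDistributed, Dense)
from tensorflow.keras.models import Model

VOCAB_SIZE = 20_000   # assumed vocabulary size
MAX_LEN = 50          # assumed maximum sentence length in tokens
NUM_TAGS = 3          # assumed BIO scheme: B-keyphrase, I-keyphrase, O

tokens = Input(shape=(MAX_LEN,), dtype="int32")
x = Embedding(VOCAB_SIZE, 100, mask_zero=True)(tokens)
# Reading the sentence in both directions gives every token access to
# its left and right context, which is the point of the Bi-LSTM.
x = Bidirectional(LSTM(128, return_sequences=True))(x)
# A per-token softmax labels each word as beginning, inside, or
# outside a keyphrase.
tags = TimeDistributed(Dense(NUM_TAGS, activation="softmax"))(x)

model = Model(tokens, tags)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```

Training such a tagger would pair each token with a B/I/O label derived from the gold keyphrases of the training documents.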


Cited by 55 publications (27 citation statements) · References 22 publications
“…Finally, Basaldella, Antolli, Serra, and Tasso (2018) propose a Bi-LSTM RNN which is able to exploit the previous and future context of a given word. First, the document is split into sentences that are tokenized into words.…”
Section: Deep Learning Methods (mentioning)
Confidence: 99%
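The preprocessing pipeline this statement describes (sentence splitting followed by word tokenization) looks roughly like the sketch below; the regex-based splitter is a self-contained stand-in, since the paper does not prescribe a specific tokenizer:

```python
# Split a document into sentences, then tokenize each sentence into
# words. Naive regex rules stand in for a real tokenizer here.
import re

document = ("Keyphrase extraction identifies the main topics of a text. "
            "Each word is then labeled by the tagger.")

# Split on whitespace that follows sentence-final punctuation.
sentences = re.split(r"(?<=[.!?])\s+", document.strip())
# Keep word characters together; punctuation becomes its own token.
tokenized = [re.findall(r"\w+|[^\w\s]", s) for s in sentences]

for sent in tokenized:
    print(sent)
```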
“…Recently, features based on pretrained word embeddings are widely used. In particular, deep learning methods (Alzaidy et al., 2019; Basaldella et al., 2018; Wang et al., 2018; Zhang et al., 2016) use such embeddings as input. Additionally, GloVe pretrained word embeddings are used along with an IDF-weighted scheme for each phrase representation by Wang and Li (2017) (PCU-ICL).…”
Section: External Knowledge (mentioning)
Confidence: 99%
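The IDF-weighted phrase representation this statement attributes to Wang and Li (2017) can be illustrated with a small sketch; the vectors and IDF values below are toy placeholders, not actual GloVe data:

```python
import numpy as np

# Toy stand-ins for GloVe vectors and corpus IDF weights; a real system
# would load pretrained embeddings and compute IDF from a corpus.
embeddings = {
    "neural": np.array([0.2, 0.7, -0.1]),
    "network": np.array([0.5, 0.1, 0.3]),
}
idf = {"neural": 2.3, "network": 1.1}

def phrase_vector(words):
    """IDF-weighted average of the word vectors in a candidate phrase."""
    weights = np.array([idf[w] for w in words])
    vectors = np.stack([embeddings[w] for w in words])
    return (weights[:, None] * vectors).sum(axis=0) / weights.sum()

print(phrase_vector(["neural", "network"]))
```

Weighting by IDF emphasizes the rarer, more topic-specific words of a candidate phrase over common ones.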
“…From the perspective of short-text summarization and analysis, there is no doubt that a number of neural network varieties have been experimented with and proven over time. To name a few: recurrent neural networks with long short-term memory (RNN-LSTM) [27], bidirectional long short-term memory (Bi-LSTM) [28], and convolutional neural networks (CNN) [29]. The problem with classic rule-based machine learning algorithms is that, although they are faster to learn [30], they are shallow, so the learning curve never really takes off over time.…”
Section: Related Work (mentioning)
Confidence: 99%
“…Generally, the output of the two networks is combined by summation [35]. BLSTM networks are an extension of LSTMs that can improve model performance on sequence classification problems [36][37][38]. In this paper, we used RNN, LSTM, BRNN, and BLSTM to classify the protein families.…”
Section: Deep Learning Models (mentioning)
Confidence: 99%
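In Keras terms, the summation of the two directions' outputs mentioned above corresponds to the Bidirectional wrapper with merge_mode="sum", which adds the forward and backward hidden states element-wise; the input shape below is an arbitrary assumption:

```python
# Combine forward and backward LSTM outputs by element-wise summation
# instead of the default concatenation.
from tensorflow.keras.layers import Input, LSTM, Bidirectional
from tensorflow.keras.models import Model

seq = Input(shape=(None, 16))  # (time steps, feature dim); 16 is assumed
summed = Bidirectional(LSTM(32, return_sequences=True),
                       merge_mode="sum")(seq)  # sum of the two directions

model = Model(seq, summed)
model.summary()
```

With merge_mode="sum" the output keeps the hidden dimension of a single direction (32 here), whereas the default "concat" doubles it.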