Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2015)
DOI: 10.3115/v1/p15-1150

Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks

Abstract: Because of their superior ability to preserve sequence information over time, Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have obtained strong results on a variety of sequence modeling tasks. The only underlying LSTM structure that has been explored so far is a linear chain. However, natural language exhibits syntactic properties that would naturally combine words to phrases. We introduce the Tree-LSTM, a generalization of LSTMs to tree-structured network topologies. Tree-LSTMs outperform all existing systems and strong LSTM baselines on two tasks: predicting the semantic relatedness of two sentences (SemEval 2014, Task 1) and sentiment classification (Stanford Sentiment Treebank).
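The Child-Sum Tree-LSTM update the abstract introduces can be made concrete with a short sketch. At each node, the children's hidden states are summed to drive the input, output, and update gates, while each child receives its own forget gate over its memory cell. The following NumPy sketch is illustrative only: the class and parameter names and the random initialization are assumptions, not the authors' released implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ChildSumTreeLSTMCell:
    """Minimal Child-Sum Tree-LSTM node update (illustrative sketch,
    not the authors' code). The names W/U/b are assumed conventions."""

    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        # one input->hidden (W) and one hidden->hidden (U) matrix per gate i, f, o, u
        self.W = {g: rng.normal(0.0, 0.1, (hidden_dim, input_dim)) for g in "ifou"}
        self.U = {g: rng.normal(0.0, 0.1, (hidden_dim, hidden_dim)) for g in "ifou"}
        self.b = {g: np.zeros(hidden_dim) for g in "ifou"}

    def forward(self, x, child_states):
        """x: node input vector; child_states: list of (h_k, c_k) pairs."""
        # sum of the children's hidden states; the zero vector at a leaf
        h_tilde = np.zeros(self.b["i"].shape)
        for h_k, _ in child_states:
            h_tilde = h_tilde + h_k
        i = sigmoid(self.W["i"] @ x + self.U["i"] @ h_tilde + self.b["i"])
        o = sigmoid(self.W["o"] @ x + self.U["o"] @ h_tilde + self.b["o"])
        u = np.tanh(self.W["u"] @ x + self.U["u"] @ h_tilde + self.b["u"])
        # one forget gate per child, conditioned on that child's own hidden state
        c = i * u
        for h_k, c_k in child_states:
            f_k = sigmoid(self.W["f"] @ x + self.U["f"] @ h_k + self.b["f"])
            c = c + f_k * c_k
        h = o * np.tanh(c)
        return h, c

# Toy usage: two leaves composed into one parent node.
cell = ChildSumTreeLSTMCell(input_dim=4, hidden_dim=8)
rng = np.random.default_rng(1)
left = cell.forward(rng.normal(size=4), [])
right = cell.forward(rng.normal(size=4), [])
h_parent, c_parent = cell.forward(rng.normal(size=4), [left, right])
```

Because the children's hidden states are summed, the update is invariant to child order and handles a variable branching factor, which is what lets the same cell run over arbitrary parse trees.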

Cited by 2,371 publications (2,063 citation statements). References 31 publications.
“…Given the largely unexplored research space around neural networks, we could experiment with more complex learning structures and supply the additional information about the code in different ways. A first such idea is to use tree-LSTM networks [20] to handle the abstract syntax trees of the language in use.…”
Section: Future Work (unclassified)
“…These models usually build up the sentence representation directly from the lexical surface representation and rely on the pooling layer to capture the dependencies between words. Another popular method for continuous sentence representation is based on the recursive neural network (Socher et al., 2012; Socher et al., 2013; Tai et al., 2015). These models use a tree structure to compose a continuous sentence representation and have the advantage of capturing more fine-grained sentential structure through the use of parse trees.…”
Section: Related Work (mentioning)
confidence: 99%
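For contrast with the (Tree-)LSTM models, the recursive-neural-network composition this excerpt refers to reduces to repeatedly applying one shared affine map plus a nonlinearity over a parse tree. A minimal sketch, assuming binary trees; the function name `compose` and the toy dimensions are illustrative assumptions:

```python
import numpy as np

def compose(left, right, W, b):
    """Recursive composition: parent = tanh(W [left; right] + b)."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

# Toy usage: compose a three-word sentence along a right-branching parse.
rng = np.random.default_rng(0)
d = 4                                   # embedding dimension (illustrative)
W = rng.normal(0.0, 0.1, (d, 2 * d))    # shared composition matrix
b = np.zeros(d)
w1, w2, w3 = (rng.normal(size=d) for _ in range(3))
sentence_vec = compose(w1, compose(w2, w3, W, b), W, b)
```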
“…In natural language processing, the LSTM has been used for text-related tasks such as sentiment analysis and semantic representation [7], [8]. Several variations of the LSTM exist, but in this paper we use the following LSTM notation and formulations, as used in [9]:…”
Section: Long Short-Term Memory (mentioning)
confidence: 99%
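The quoted passage is cut off before the formulation it introduces. For reference, the standard LSTM update that such formulations typically follow is given below; that [9] uses exactly this variant (without, e.g., peephole connections) is an assumption, since the excerpt is truncated.

```latex
% Standard LSTM update (assumed variant; the excerpt's reference [9] is truncated)
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) \\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) \\
u_t &= \tanh(W_u x_t + U_u h_{t-1} + b_u) \\
c_t &= f_t \odot c_{t-1} + i_t \odot u_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
```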