2019
DOI: 10.1007/978-3-030-16841-4_25
Deep Tree Transductions - A Short Survey

Abstract: The paper surveys recent extensions of Long Short-Term Memory networks to handle tree structures, from the perspective of learning non-trivial forms of isomorph structured transductions. It provides a discussion of modern TreeLSTM models, showing the effect of the bias induced by the direction of tree processing. An empirical analysis on real-world benchmarks highlights that no single model is adequate to effectively approach all transduction problems.
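
The survey centres on TreeLSTM models, which extend the LSTM cell from sequences to trees. As a concrete illustration, the following is a minimal sketch of a bottom-up Child-Sum TreeLSTM cell in the style of Tai et al. (2015); the class name, dimensions, and PyTorch framing are illustrative assumptions, not code from the surveyed paper.

```python
# Minimal sketch of a bottom-up Child-Sum TreeLSTM cell (Tai et al., 2015
# style). Names and dimensions are illustrative assumptions, not taken
# from the surveyed paper.
import torch
import torch.nn as nn

class ChildSumTreeLSTMCell(nn.Module):
    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        # Input gate, output gate, and candidate share one affine map
        # each for the node label x and the summed child states.
        self.iou_x = nn.Linear(input_dim, 3 * hidden_dim)
        self.iou_h = nn.Linear(hidden_dim, 3 * hidden_dim, bias=False)
        # The forget gate is computed separately for every child.
        self.f_x = nn.Linear(input_dim, hidden_dim)
        self.f_h = nn.Linear(hidden_dim, hidden_dim, bias=False)

    def forward(self, x, child_h, child_c):
        # x: (input_dim,); child_h, child_c: (num_children, hidden_dim),
        # with num_children == 0 at the leaves.
        h_sum = child_h.sum(dim=0)  # aggregate children by summation
        i, o, u = torch.chunk(self.iou_x(x) + self.iou_h(h_sum), 3)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        f = torch.sigmoid(self.f_x(x) + self.f_h(child_h))  # per-child gate
        c = i * u + (f * child_c).sum(dim=0)  # new memory cell state
        h = o * torch.tanh(c)
        return h, c
```

A bottom-up transduction applies this cell recursively from the leaves to the root; a top-down variant propagates hidden states from the root towards the leaves instead, which is the processing-direction bias the abstract refers to.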

Cited by 7 publications (4 citation statements)
References 17 publications
“…We believe such criticisms should be positively embraced by the community to pursue the growth of the field. Some attempts to provide a set of standardized data and methods now appear to be under development 1 . In relation to this challenge, recent progress has been facilitated by the growth and wide adoption by the community of modern software packages for the adaptive processing of graphs.…”
Section: Discussion (mentioning, confidence 99%)

“…Another interesting application field leverages graph learning methods for Natural Language Processing (NLP) tasks, where the input is usually represented as a sequence of tokens. By means of dependency parsers, we can augment the input as a tree [1] or as a graph and learn a model that takes into account the syntactic [71] and semantic [70] relations between tokens in the text. An example is neural machine translation, which can be formulated as a graph-to-sequence problem [8] to consider syntactic dependencies in the source and target sentence.…”
Section: Natural Language Processing (mentioning, confidence 99%)
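
The passage above mentions using dependency parsers to turn token sequences into tree- or graph-structured inputs. As a hypothetical illustration (assuming spaCy and its en_core_web_sm model, which the cited works do not necessarily use), a dependency tree can be read off the head pointers of a parsed sentence:

```python
# Hypothetical illustration: extracting a dependency tree with spaCy.
# Assumes `pip install spacy` and `python -m spacy download en_core_web_sm`;
# the cited works may use different parsers and representations.
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("The cat chased the mouse.")
for token in doc:
    # Each token points to its syntactic head; these arcs form a tree
    # rooted at the main verb.
    print(f"{token.text:>7} --{token.dep_}--> {token.head.text}")
```
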
“…Lately, attention is shifting towards the generative use of such learned distributions. For example, Deep Learning models have been used with success to generate parse trees [20,21], intermediate representations of reasoning tasks [22], and more broadly for general structure-to-structure transduction [23]. One domain where deep generative models for graphs are used extensively is Cheminformatics.…”
Section: Related Work (mentioning, confidence 99%)

“…Both challenges have been tackled over the years by a variety of approaches ranging from early works on recursive neural networks [11], to their more recent deep learning style re-factoring [17] (see [2] for a recent survey), and including kernel-based approaches [1], [12], [16] and generative models [5], [6], [8]. In this work, we focus in particular on the latter models, which are referred to as Hidden Tree Markov Models (HTMM), being obtained as a generalisation of the Hidden Markov Model (HMM) to the tree domain.…”
Section: Introduction (mentioning, confidence 99%)