2021
DOI: 10.48550/arxiv.2109.01982
Preprint
Learning Hierarchical Structures with Differentiable Nondeterministic Stacks

Abstract: Learning hierarchical structures in sequential data, from simple algorithmic patterns to natural language, in a reliable, generalizable way remains a challenging problem for neural language models. Past work has shown that recurrent neural networks (RNNs) struggle to generalize on held-out algorithmic or syntactic patterns without supervision or some inductive bias. To remedy this, many papers have explored augmenting RNNs with various differentiable stacks, by analogy with finite automata and pushdown automata…
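
The abstract's idea of "augmenting RNNs with various differentiable stacks" can be illustrated with a minimal sketch of a continuous superposition stack in the style of earlier work (e.g. Joulin & Mikolov, 2015), which is prior work the paper builds on, not the nondeterministic stack this paper itself proposes. The function name `stack_step` and the fixed-depth representation are illustrative assumptions, not the paper's API.

```python
import numpy as np

def stack_step(stack, actions, push_val):
    """One step of a continuous (differentiable) stack.

    stack:    (depth,) array of cell values, top at index 0.
    actions:  (3,) nonnegative weights for [push, pop, no-op],
              typically the output of a softmax, summing to 1.
    push_val: scalar value to push.

    The new stack is a convex combination of the three discrete
    stack operations, so the update is differentiable in `actions`.
    """
    a_push, a_pop, a_noop = actions
    depth = stack.shape[0]

    # Result of a hard push: new value on top, everything shifts down.
    pushed = np.empty(depth)
    pushed[0] = push_val
    pushed[1:] = stack[:-1]

    # Result of a hard pop: everything shifts up, bottom padded with zero.
    popped = np.empty(depth)
    popped[:-1] = stack[1:]
    popped[-1] = 0.0

    return a_push * pushed + a_pop * popped + a_noop * stack
```

With one-hot action weights this reduces to an ordinary stack; with soft weights it interpolates between push, pop, and no-op, which is what lets gradients flow through the controller's action choices during training.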

Cited by 0 publications
References 13 publications