2019
DOI: 10.48550/arxiv.1906.01634
Preprint

On the Realization of Compositionality in Neural Networks

Cited by 1 publication (1 citation statement)
References: 0 publications
“…These generalization tasks pose particular challenges, as they require generating OOD output sequences, composed of novel combinations of in-distribution output subsequences, from in-distribution input sequences. Whereas other approaches have focused on modifying the training objective (Lake, 2019; Baan et al., 2019), model architecture (Kuo et al., 2020; Gao et al., 2020), or data augmentation (Andreas, 2019; Akyürek et al., 2020), we use RD to reformulate and decompose a complex sequence generation task into a series of smaller predictions that appear in-distribution from the perspective of the model.…”
Section: Discussion
confidence: 99%