Findings of the Association for Computational Linguistics: ACL 2022 (2022)
DOI: 10.18653/v1/2022.findings-acl.258
Word-level Perturbation Considering Word Length and Compositional Subwords

Cited by 2 publications (1 citation statement)
References 0 publications
“…In this study, we proposed an inference strategy to mitigate the discrepancy between training and inference in subword regularization. In our experiments, we focused on the subword regularization proposed by Kudo (2018), but the proposed inference strategy can also be applied to variants of subword regularization such as BPE dropout (Provilkov et al., 2020) and compositional word replacement (Hiraoka et al., 2022). Takase and Kiyono (2021) reported that simple perturbations such as word dropout are effective when a large amount of training data is available.…”
Section: Related Work
Confidence: 99%