2022
DOI: 10.48550/arxiv.2204.02633
Preprint

DAGAM: Data Augmentation with Generation And Modification

Cited by 1 publication (1 citation statement)
References: 0 publications
“…Wu et al [22] proposed a text smoothing method that uses BERT to encode one-hot representations into smoothed and interpolated representations for data augmentation. Jo et al [23] proposed the DAGAM method, which uses three sentences with the same label as input and employs the T5 generation model [24] to generate augmented sentences. Liu et al [25] proposed the SRAFBN model, which uses a self-attention mechanism to extract the key information in images so that the generated images can be of a higher quality.…”
Section: Data Augmentation
confidence: 99%
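
The citation statement describes DAGAM as taking three sentences that share a label and using the T5 generation model to produce an augmented sentence with that same label. The snippet below is a minimal sketch of that idea, assuming a Hugging Face `t5-base` checkpoint; the "summarize:" prefix and the generation parameters are illustrative assumptions, not the authors' exact configuration.

```python
# Rough sketch of generation-based augmentation in the DAGAM style:
# concatenate three same-label sentences, feed them to T5, and treat the
# generated text as a new training example with the shared label.
# Assumptions: Hugging Face transformers, the "summarize:" prefix, and the
# decoding parameters are illustrative, not the paper's exact setup.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

def augment(sentences, label):
    """Generate one augmented example from three sentences sharing `label`."""
    assert len(sentences) == 3
    # Combine the same-label sentences into a single T5 input sequence.
    text = "summarize: " + " ".join(sentences)
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    output_ids = model.generate(
        **inputs, max_length=64, num_beams=4, early_stopping=True
    )
    augmented = tokenizer.decode(output_ids[0], skip_special_tokens=True)
    return augmented, label  # the new example keeps the shared label
```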