2020
DOI: 10.48550/arxiv.2002.03912
Preprint

A Probabilistic Formulation of Unsupervised Text Style Transfer

Junxian He, Xinyi Wang, Graham Neubig, et al.

Abstract: We present a deep generative model for unsupervised text style transfer that unifies previously proposed non-generative techniques. Our probabilistic approach models non-parallel data from two domains as a partially observed parallel corpus. By hypothesizing a parallel latent sequence that generates each observed sequence, our model learns to transform sequences from one domain to another in a completely unsupervised fashion. In contrast with traditional generative sequence models (e.g. the HMM), our model makes…
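As a reading aid, the marginal-likelihood view the abstract describes can be written out. This is a minimal sketch in generic notation, not the paper's exact symbols: x is an observed sequence in one domain, y its hypothesized latent parallel sequence in the other, p(y) a language-model prior over that domain, p(x|y) an encoder-decoder transduction distribution, and q(y|x) an amortized inference network.

\log p(x) = \log \sum_{y} p(x \mid y)\, p(y) \;\geq\; \mathbb{E}_{q(y \mid x)}\left[ \log p(x \mid y) + \log p(y) - \log q(y \mid x) \right]

Maximizing this evidence lower bound in both transfer directions, with no parallel pairs ever observed, is one way such a model can be trained; the parameterization above is an assumption for illustration only.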

Cited by 14 publications (21 citation statements)
References 20 publications
“…Training is accomplished using an approach from (He et al., 2020): We employ seq2seq inference networks and use an amortized inference scheme similar to that used in a conventional VAE, but for sequential discrete latents. Ideally, learning should directly optimize the log data likelihood, which is the marginal shown in Eq.…”
Section: Learning and Inference
confidence: 99%
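To make the amortized scheme described in the statement above concrete, here is a minimal PyTorch-style sketch of a single-sample ELBO estimate with a seq2seq inference network over discrete latent sequences. Everything here is illustrative rather than the cited paper's actual implementation: SeqModel, sample_from, and the stop-gradient treatment of the discrete sample are assumptions, and a real system would need a gradient estimator for the discrete latents (e.g. score-function methods, or tying the inference network to the reverse-direction transduction model).

import torch
import torch.nn as nn

class SeqModel(nn.Module):
    # Toy GRU scorer for log p(target | source); pass src=None for a prior LM.
    def __init__(self, vocab_size, hidden=64):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, hidden)
        self.rnn = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def log_prob(self, tgt, src=None):
        h0 = None
        if src is not None:
            _, h0 = self.rnn(self.emb(src))  # encode source into initial state
        logits, _ = self.rnn(self.emb(tgt[:, :-1]), h0)
        logp = self.out(logits).log_softmax(-1)
        # Sum per-token log-probabilities of the shifted target tokens.
        return logp.gather(-1, tgt[:, 1:].unsqueeze(-1)).squeeze(-1).sum(-1)

def elbo_single_sample(x, infer_net, decoder, prior_lm, sample_from):
    # ELBO(x) = E_{q(y|x)}[ log p(x|y) + log p(y) - log q(y|x) ],
    # estimated with one latent sequence y; no gradient flows through
    # the discrete sample itself (a deliberate simplification here).
    with torch.no_grad():
        y = sample_from(infer_net, x)  # hypothetical helper: decode y ~ q(y|x)
    log_q = infer_net.log_prob(y, src=x)
    log_px_given_y = decoder.log_prob(x, src=y)
    log_p_y = prior_lm.log_prob(y)
    return (log_px_given_y + log_p_y - log_q).mean()

A training step would then minimize -elbo_single_sample(...) on batches from each domain, alternating transfer directions.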
“…Fundamentally, our framework is concerned with stylistic features of human-generated text. Thus, a large body of prior work on methods for unsupervised style transfer is related to our approach (Santos et al., 2018; Yang et al., 2018; Luo et al., 2019; He et al., 2020). There is also a vast body of work on style obfuscation (Emmery et al., 2018; Reddy and Knight, 2016; Bevendorff et al., 2019; Shetty et al., 2018).…”
Section: Evaluation With Human Judgments
confidence: 99%
“…We now discuss how to find the most suitable value of each balancing variable. We employ the variational inference framework from probabilistic MAML (He et al., 2020) and TAML to extract the task-specific information. The variational inference framework is used to compute posterior distributions for the balancing variables z_τ, γ_τ, ω_τ.…”
Section: Style Transfer Model
confidence: 99%
“…If parallel training data is available, a wide range of supervised techniques from machine translation (e.g., Seq2Seq models (Bahdanau et al., 2014) and Transformers (Vaswani et al., 2017)) can also be applied to style transfer problems. For non-parallel data, He et al. (2020) proposed a probabilistic formulation that models non-parallel data from two domains as a partially observed parallel corpus and learns the style transfer model in a completely unsupervised fashion. Unsupervised machine translation methods have also been adapted to this setting (Zhang et al., 2018).…”
Section: Introduction
confidence: 99%
“…Recent work has explored a variety of text generation tasks that condition on a control variable to specify a desired trait of the output. Examples include summarization conditioned on a desired output length (Fan et al., 2018), paraphrase generation conditioned on a parse tree (Krishna et al., 2020), style transfer conditioned on sentiment (He et al., 2020b), and more (Hu et al., 2017; Li et al., 2018; Fu et al., 2018; He et al., 2020a). In this work, we specifically focus on text generation tasks that condition on a scalar control variable, as depicted in Figure 1.…”
Section: Introduction
confidence: 99%