2021
DOI: 10.1162/tacl_a_00366
Extractive Opinion Summarization in Quantized Transformer Spaces

Abstract: We present the Quantized Transformer (QT), an unsupervised system for extractive opinion summarization. QT is inspired by Vector-Quantized Variational Autoencoders, which we repurpose for popularity-driven summarization. It uses a clustering interpretation of the quantized space and a novel extraction algorithm to discover popular opinions among hundreds of reviews, a significant step towards opinion summarization of practical scope. In addition, QT enables controllable summarization without further training,…
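The abstract describes repurposing Vector-Quantized VAEs so that sentence encodings are snapped to entries of a learned codebook, and those entries are then read as clusters of opinions. A minimal sketch of that nearest-neighbor quantization step (the function name, toy codebook, and 2-D vectors here are illustrative assumptions, not the paper's actual multi-head quantization over Transformer encodings):

```python
import numpy as np

def quantize(encodings, codebook):
    """Assign each encoding vector to its nearest codebook entry
    (Euclidean distance), in the style of VQ-VAE vector quantization."""
    # Pairwise squared distances: shape (num_vectors, codebook_size)
    d = ((encodings[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    assignments = d.argmin(axis=1)  # index of the nearest code per vector
    return codebook[assignments], assignments

# Toy example: 4 two-dimensional sentence encodings, 3 codebook "clusters".
codebook = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
encs = np.array([[0.1, -0.1], [0.9, 1.2], [4.8, 5.1], [1.1, 0.8]])
quantized, assignments = quantize(encs, codebook)
# assignments → [0, 1, 2, 1]: two sentences share code 1
```

Counting how many sentences land on each code (e.g. `np.bincount(assignments)`) yields the kind of popularity signal that an extraction step could use to pick representative sentences from the most populated clusters.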

Cited by 48 publications (32 citation statements)
References 27 publications
“…Recently, specifically for opinionated texts, several abstractive multi-document summarization methods have been developed, such as MeanSum, Copycat, and DenoiseSum, as explained in Section 4.3. Concurrently with our work, Angelidis et al. (2021) use quantized transformers enabling aspect-based extractive summarization, and Amplayo et al. (2020) incorporate aspect and sentiment distributions into unsupervised abstractive summarization. Our method incorporates topic-tree structure into unsupervised abstractive summarization and generates summaries consisting of multiple granularities of topics.…”
Section: Unsupervised Summary Generation (mentioning)
confidence: 94%
“…The problem could be exacerbated by ungrammatical spoken utterances and transcription errors. Instead, we consider VQ-VAE, an unsupervised representation learning technique (van den Oord et al., 2017; Jin et al., 2020; Angelidis et al., 2021), for content extraction. Unsupervised training of the VQ-VAE model and its inference could potentially be performed at the same time, allowing important utterances to be extracted from a transcript segment on the fly during streaming, without interrupting the learning process.…”
Section: Related Work (mentioning)
confidence: 99%
“…The method was explored for opinion summarization (Angelidis et al., 2021) and machine translation (Prato et al., 2020). We are interested in using the method to account for domain characteristics of livestreams, which showcase artists' and designers' new and creative work with Photoshop, Illustrator, and other tools.…”
Section: Summarization (mentioning)
confidence: 99%