Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (ACL 2020)
DOI: 10.18653/v1/2020.acl-main.513

OpinionDigest: A Simple Framework for Opinion Summarization

Abstract: We present OPINIONDIGEST, an abstractive opinion summarization framework, which does not rely on gold-standard summaries for training. The framework uses an Aspect-based Sentiment Analysis model to extract opinion phrases from reviews, and trains a Transformer model to reconstruct the original reviews from these extractions. At summarization time, we merge extractions from multiple reviews and select the most popular ones. The selected opinions are used as input to the trained Transformer model, which verbaliz…
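The abstract outlines a three-step pipeline: extract opinion phrases, merge and rank them by popularity, and verbalize the selection. A minimal sketch of that flow follows; extract_opinions and verbalize are hypothetical stand-ins for the paper's ABSA extractor and trained Transformer, and exact-phrase frequency counting is a simplification of how the paper merges extractions.

```python
"""Minimal sketch of the OpinionDigest summarization-time pipeline.

`extract_opinions` and `verbalize` are hypothetical stand-ins, not the
authors' released API: the paper uses an Aspect-based Sentiment Analysis
model for extraction and a Transformer (trained to reconstruct reviews
from their extracted opinions) for generation.
"""
from collections import Counter
from typing import Callable, List


def select_popular_opinions(
    reviews: List[str],
    extract_opinions: Callable[[str], List[str]],
    top_k: int = 10,
) -> List[str]:
    # Merge opinion phrases extracted from every input review,
    # then keep the most frequently mentioned ones.
    counts: Counter = Counter()
    for review in reviews:
        counts.update(extract_opinions(review))
    return [phrase for phrase, _ in counts.most_common(top_k)]


def summarize(
    reviews: List[str],
    extract_opinions: Callable[[str], List[str]],
    verbalize: Callable[[List[str]], str],
) -> str:
    # At summarization time, the selected opinions are fed to the
    # trained Transformer, which verbalizes them into a summary.
    selected = select_popular_opinions(reviews, extract_opinions)
    return verbalize(selected)


if __name__ == "__main__":
    # Toy stand-ins for illustration only: a real system would plug in
    # the ABSA extractor and the trained reconstruction model here.
    toy_extract = lambda review: [p.strip() for p in review.split(";")]
    toy_verbalize = lambda ops: "Guests mention: " + ", ".join(ops) + "."
    reviews = [
        "friendly staff; great location",
        "great location; small rooms",
        "friendly staff; great location",
    ]
    print(summarize(reviews, toy_extract, toy_verbalize))
```

Because the generator was trained only to reconstruct reviews from their own extractions, no reference summaries are needed; at test time the same model simply receives the pooled, most popular opinions instead.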

Cited by 54 publications (38 citation statements)
References 16 publications (16 reference statements)
“…More recently, Tan et al. (2017) suggested a novel generative topic-aspect sentiment model, while another work proposed a system able to extract both general and aspect-specific summaries. As for abstractive summarization, recent advances in pre-training neural networks were explored in the context of product reviews in unsupervised and few-shot learning schemes, which led to promising results (Chu and Liu, 2019; Bražinskas et al., 2020a,b; Suhara et al., 2020).…”
Section: Related Work
confidence: 99%
“…Recently, several abstractive neural summarization methods have shown promising results. These models require no summaries for training (Chu and Liu, 2019;Bražinskas et al, 2020b;Suhara et al, 2020), or only a handful of them (Bražinskas et al, 2020a). As discussed in the previous section, textual summaries provide more detail than aspect-based sentiment summaries, but lack a quantitative dimension.…”
Section: Related Work
confidence: 99%
“…A concurrent model, DENOISESUM (Amplayo and Lapata, 2020), uses a synthetically generated dataset of source reviews to train a generator to denoise and distill common information. Another parallel work, OPINIONDIGEST (Suhara et al., 2020), considers controllable opinion aggregation and is a pipeline framework for abstractive summary generation. Our conditioning-on-text-properties approach is similar to Ficler and Goldberg (2017), yet we rely on automatically derived properties that associate a target with its sources, and learn a separate module to generate their combinations.…”
Section: Related Work
confidence: 99%