Proceedings of the 2nd Workshop on New Frontiers in Summarization 2019
DOI: 10.18653/v1/d19-5404

Learning to Create Sentence Semantic Relation Graphs for Multi-Document Summarization

Abstract: Linking facts across documents is a challenging task, as the language used to express the same information in a sentence can vary significantly, which complicates the task of multi-document summarization. Consequently, existing approaches rely heavily on hand-crafted features, which are domain-dependent and hard to craft, or on additional annotated data, which is costly to gather. To overcome these limitations, we present a novel method which makes use of two types of sentence embeddings: universal embeddings, w…
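As the title and abstract describe, the paper links sentences across documents through a sentence semantic relation graph built from sentence embeddings. A minimal sketch of one way such a graph could be constructed, assuming cosine similarity with a pruning threshold; the threshold value, function name, and NumPy pipeline below are illustrative assumptions, not the authors' exact procedure:

```python
# Minimal sketch: build a sentence semantic relation graph from
# precomputed sentence embeddings. Edges connect sentence pairs whose
# cosine similarity exceeds a threshold; the 0.5 default and the helper
# name are illustrative assumptions, not the paper's exact settings.
import numpy as np

def build_relation_graph(embeddings: np.ndarray, threshold: float = 0.5):
    """Return an adjacency dict {i: [(j, weight), ...]} over sentences."""
    # Normalize rows so the dot product equals cosine similarity.
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    unit = embeddings / np.clip(norms, 1e-12, None)
    sim = unit @ unit.T                      # pairwise cosine similarities
    graph = {i: [] for i in range(len(embeddings))}
    for i in range(len(embeddings)):
        for j in range(i + 1, len(embeddings)):
            if sim[i, j] >= threshold:       # keep only strong relations
                graph[i].append((j, float(sim[i, j])))
                graph[j].append((i, float(sim[i, j])))
    return graph

# Usage: embeddings from any sentence encoder, one row per sentence.
rng = np.random.default_rng(0)
toy = rng.normal(size=(4, 8))                # 4 sentences, 8-dim embeddings
print(build_relation_graph(toy, threshold=0.2))
```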

Cited by 18 publications (10 citation statements) · References 30 publications
“…LexRank) [8], our model performed better in the Rouge score. The state-of-the-art graph-based approach SemSentSum [34] is a fully data-driven model that uses cross-entropy as the objective function. As expected, it outperformed other traditional baselines in Rouge-2 and Rouge-L, but our model still performed better.…”
Section: E. Qualitative Results
confidence: 99%
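The comparison above is in terms of Rouge-2 and Rouge-L. For reference, a minimal sketch of how such scores are commonly computed with Google's `rouge-score` package; the reference and candidate strings are placeholders, not data from the cited papers:

```python
# Minimal sketch: computing ROUGE-2 and ROUGE-L F-scores with the
# rouge-score package (pip install rouge-score). The strings below
# are placeholders, not summaries from the paper.
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rouge2", "rougeL"], use_stemmer=True)
reference = "the cat sat on the mat"
candidate = "the cat lay on the mat"
scores = scorer.score(reference, candidate)
for name, score in scores.items():
    # Each entry holds precision, recall, and F-measure.
    print(f"{name}: F1 = {score.fmeasure:.3f}")
```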
“…Previous works [34], [13] use cross-entropy loss for training. When we trained our model with cross-entropy, the model tended to output scores close to 0 or 1, which may pose an obstacle for ranking.…”
Section: Training
confidence: 99%
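The point about cross-entropy pushing scores toward 0 or 1 can be made concrete: a pairwise ranking objective only constrains the relative order of scores, not their absolute values. A minimal PyTorch sketch contrasting the two; the score tensors are placeholders, not values from either paper:

```python
# Minimal sketch: binary cross-entropy drives each score toward 0 or 1,
# while a pairwise margin ranking loss only asks that a better sentence
# score higher than a worse one. Score values here are placeholders.
import torch
import torch.nn as nn

scores_pos = torch.tensor([0.62, 0.55])   # scores of better sentences
scores_neg = torch.tensor([0.48, 0.51])   # scores of worse sentences

# Cross-entropy view: hard 1/0 labels pull scores toward the extremes.
bce = nn.BCELoss()
ce_loss = bce(torch.cat([scores_pos, scores_neg]),
              torch.tensor([1.0, 1.0, 0.0, 0.0]))

# Ranking view: only the relative order within each pair matters.
ranker = nn.MarginRankingLoss(margin=0.1)
# target = 1 means the first input should be ranked higher.
rank_loss = ranker(scores_pos, scores_neg, torch.ones(2))

print(f"BCE loss: {ce_loss.item():.3f}, ranking loss: {rank_loss.item():.3f}")
```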
“…Hierarchical solutions. To better preserve cross-document relations and obtain semantically rich representations, hierarchical concatenation solutions leverage graph-based techniques to work from the word and sentence level (Wan and Yang, 2006; Liao et al., 2018; Nayeem et al., 2018; Antognini and Faltings, 2019) up to the document level (Amplayo and Lapata, 2021). Other hierarchical approaches include multi-head pooling and inter-paragraph attention architectures (Liu and Lapata, 2019a), attention models with maximal marginal relevance (Fabbri et al., 2019), and attention across different granularity representations (Jin et al., 2020).…”
Section: Related Work
confidence: 99%
“…Goo and Chen (2018). Graph-to-Sequence Generation: Recent research efforts for text generation consider utilizing Graph Neural Networks (GNNs) to better model structured data, such as AMR (Beck et al., 2018; Ribeiro et al., 2019), SQL (Xu et al., 2018), and knowledge graphs (Koncel-Kedziorski et al., 2019). Additionally, many works employ GNNs in non-structural scenarios, such as summarization (Yasunaga et al., 2017; Fernandes et al., 2018; Antognini and Faltings, 2019) and comment generation (Li et al., 2019b), by transforming the input into a meaningful graph. We propose the discourse graph to facilitate information flow over the graph.…”
Section: Related Work
confidence: 99%