Proceedings of the 22nd Conference on Computational Natural Language Learning 2018
DOI: 10.18653/v1/k18-1023
A Temporally Sensitive Submodularity Framework for Timeline Summarization

Abstract: Timeline summarization (TLS) creates an overview of long-running events via dated daily summaries for the most important dates. TLS differs from standard multi-document summarization (MDS) in the importance of date selection, in the interdependencies between summaries of different dates, and in having very short summaries compared to the number of corpus documents. However, we show that MDS optimization models using submodular functions can be adapted to yield well-performing TLS models by designing objective functions…
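The submodular optimization the abstract refers to is typically solved with a greedy algorithm, which enjoys a (1 − 1/e) approximation guarantee for monotone submodular objectives under a cardinality budget. Below is a minimal illustrative sketch of that generic greedy scheme — not the authors' actual objective or code; the word-coverage function, sentence sets, and budget are hypothetical stand-ins.

```python
# Illustrative sketch (not the paper's implementation): greedy maximization
# of a monotone submodular coverage objective under a cardinality budget.
# Sentences are modeled as sets of words; the objective counts distinct
# words covered, a classic submodular function.

def coverage(selected, sentence_words):
    """Submodular objective: number of distinct words covered so far."""
    covered = set()
    for i in selected:
        covered |= sentence_words[i]
    return len(covered)

def greedy_select(sentence_words, budget):
    """Repeatedly pick the sentence with the largest marginal gain."""
    selected = []
    remaining = set(range(len(sentence_words)))
    while remaining and len(selected) < budget:
        base = coverage(selected, sentence_words)
        best, best_gain = None, 0
        for i in sorted(remaining):  # sorted for deterministic ties
            gain = coverage(selected + [i], sentence_words) - base
            if gain > best_gain:
                best, best_gain = i, gain
        if best is None:  # no sentence adds new coverage
            break
        selected.append(best)
        remaining.remove(best)
    return selected

# Hypothetical toy corpus: three sentences as word sets.
sentences = [
    {"protests", "erupt", "capital"},
    {"protests", "continue"},
    {"government", "resigns", "capital"},
]
print(greedy_select(sentences, 2))  # picks the two most complementary sentences
```

A TLS objective additionally scores candidate dates and penalizes redundancy across days, but the greedy skeleton stays the same.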


Cited by 30 publications (82 citation statements).
References 16 publications.
“…We introduce the first abstractive system for TLS and show that it outperforms current extractive TLS systems such as Martschat and Markert (2018) when the input corpora are large with low compression rate.…”
Confidence: 99%
“…While both TLS and Multi-Document Summarization (MDS) generate summaries from multiple input documents, there are substantial differences between the two tasks. Specifically, Martschat and Markert (2018) … Figure 1: A graphical overview of our system. We can see that not all clusters are included in the timeline.…”
Section: Differences to MDS
Confidence: 99%