Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d18-1444
Controlling Length in Abstractive Summarization Using a Convolutional Neural Network

Abstract: Convolutional neural networks (CNNs) have met great success in abstractive summarization, but they cannot effectively generate summaries of desired lengths. Because generated summaries are used in different scenarios which may have space or length constraints, the ability to control the summary length in abstractive summarization is an important problem. In this paper, we propose an approach to constrain the summary length by extending a convolutional sequence-to-sequence model. The results show that this app…

Cited by 50 publications (59 citation statements)
References 9 publications
“…Other hyperparameters of the models and optimization methods used in our experiments are summarized in Table 1. We halve the word embedding size, hidden state size, and the number of layers of LC from the original setting of Liu et al. (2018). This is to avoid out-of-memory errors on our GPU when applying MRT and GOLC; moreover, our objective in the experiments with LC is to evaluate the length-control ability of each optimization method.…”
Section: Optimization Methods To Be Compared
confidence: 99%
“…Other methods for improving abstractive summarization models include the use of existing summaries as soft templates alongside a source text, and the extraction of actual fact descriptions from a source text. Although summary length control in abstractive summarization has been studied, previous studies focus on incorporating a length-controlling method into neural abstractive summarization models (Kikuchi et al., 2016; Fan et al., 2018; Liu et al., 2018; Fevry and Phang, 2018; Schumann, 2018). In contrast, our research focuses on a global optimization method.…”
Section: Related Work
confidence: 99%
“…LenInit initializes the LSTM cell of the decoder with an embedding that depends on the scalar value of the desired length. Liu et al. (2018) incorporated such scalar values into the initial state of the decoder in a CNN encoder-decoder. These approaches can handle any length, but it is reasonable to incorporate the distance to the desired terminal position into each decoding step, as in LenEmb.…”
Section: Introduction
confidence: 99%
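The quote above contrasts length control at initialization (LenInit, and the CNN decoder-state approach of Liu et al., 2018) with per-step control (LenEmb), where the decoder is told the remaining length at every step. A minimal toy sketch of the per-step idea, in plain Python with illustrative names (this is not the authors' actual implementation; a real model would look up a learned embedding for each remaining-length index and add it to the decoder input):

```python
def remaining_length_inputs(desired_len):
    """Indices fed to a length-embedding table during decoding.

    Step t receives the remaining length desired_len - t (clipped at 0),
    so the model can learn when the end of the summary is approaching.
    """
    return [max(desired_len - t, 0) for t in range(desired_len)]


def toy_decode(token, desired_len):
    """Toy greedy decoder that uses only the remaining length.

    It emits a placeholder token until the remaining length reaches 1,
    at which point it emits the end-of-sequence marker, guaranteeing a
    summary of exactly desired_len tokens.
    """
    out = []
    for remaining in remaining_length_inputs(desired_len):
        out.append("<eos>" if remaining == 1 else token)
    return out
```

In contrast, a LenInit-style model would consume the desired length only once, when constructing the decoder's initial state, and rely on the recurrence (or convolution stack) to track how much budget remains.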