2019
DOI: 10.1016/j.csl.2019.04.006
Neural sentence fusion for diversity driven abstractive multi-document summarization

Cited by 12 publications (3 citation statements) · References 7 publications
“…These property inconsistencies will increase the difficulty of network learning. Confronting the nonuniform properties of different modules, the fusion analysis cannot be performed by simply listing them together, while some advanced fusion techniques are required [ 21 ]. The data belonging to each module has its own property, then a network weighted parameter is generated to adjust the input information of different modules.…”
Section: Introductionmentioning
confidence: 99%
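The statement above describes weighting the inputs of heterogeneous modules rather than simply listing them together. A minimal sketch of that idea, with hypothetical names and softmax-normalized weights standing in for the learned network parameter:

```python
import numpy as np

def fuse_modules(module_outputs, weights):
    """Weighted fusion of per-module feature vectors (illustrative sketch).

    module_outputs: list of 1-D arrays of equal length, one per module.
    weights: raw (learnable) scores, one per module, softmax-normalized so
    each module's contribution is adjusted instead of being concatenated
    as-is.
    """
    w = np.exp(weights - np.max(weights))   # numerically stable softmax
    w = w / w.sum()
    stacked = np.stack(module_outputs)      # shape: (num_modules, dim)
    return (w[:, None] * stacked).sum(axis=0)  # shape: (dim,)

# Two hypothetical modules whose outputs differ in scale
a = np.array([1.0, 2.0, 3.0])
b = np.array([10.0, 20.0, 30.0])
fused = fuse_modules([a, b], weights=np.array([0.0, 0.0]))
# equal raw weights -> elementwise mean: [5.5, 11.0, 16.5]
```

In a trained network the `weights` would be parameters updated by backpropagation; here they are fixed only to show the fusion step itself.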
“…These property inconsistencies will increase the difficulty of network learning. Confronting the nonuniform properties of different modules, the fusion analysis cannot be performed by simply listing them together, while some advanced fusion techniques are required [21].…”
Section: Introductionmentioning
confidence: 99%
“…The Transformer, as a base language model, has significantly impacted the NLP research field by addressing the deficiencies of LSTM-, CNN-, and RNN-based deep learning architectures [12,13], which is among the main reasons the transformer was chosen as the base model architecture. Various studies applying the transformer architecture have been carried out and have significantly improved results in document summarization [14].…”
Section: Introductionmentioning
confidence: 99%