2022
DOI: 10.1155/2022/6241373
N-GPETS: Neural Attention Graph-Based Pretrained Statistical Model for Extractive Text Summarization

Abstract: The extractive summarization approach involves selecting the source document’s salient sentences to build a summary. One of the most important aspects of extractive summarization is learning and modelling cross-sentence associations. Inspired by the popularity of the Transformer-based Bidirectional Encoder Representations (BERT) pretrained language model and the graph attention network (GAT), whose sophisticated network captures intersentence associations, this research work proposes a novel neural model N-GP…

Cited by 3 publications (3 citation statements) · References 50 publications
“…According to [4], extractive text summarization is the approach of selecting salient sentences from a source document to create a summary. They propose a new neural model called N-GPETS for extractive text summarization.…”
Section: Related Work
confidence: 99%
“…Model: N-GPETS [19] | Summary Type: Extractive | Data Set: ---- | Metrics: ---- | Result/Observation: Favorable results obtained (Table 1: Summary table of selected information)…”
Section: Model Summary Type
confidence: 99%
“…This article has been retracted by Hindawi, as publisher, following an investigation undertaken by the publisher [1]. This investigation has uncovered evidence of systematic manipulation of the publication and peer-review process.…”
confidence: 99%