2023 IEEE International Conference on Big Data and Smart Computing (BigComp)
DOI: 10.1109/bigcomp57234.2023.00098
Multilayer CARU Model for Text Summarization

Cited by 4 publications (2 citation statements); references 21 publications.
“…The process of comparing the text and creating the outline is made more difficult by the use of pre-trained models that are based on transformer architecture. The findings acquired from the machine learning models are evaluated and compared with the help of the BBC news dataset for this study investigation [29]. They have completed an analysis of several methods for automatically summarising text.…”
Section: Computational Intelligence and Machine Learning, e-ISSN: 2582... (mentioning, confidence: 99%)