2020 4th International Conference on Intelligent Computing and Control Systems (ICICCS)
DOI: 10.1109/iciccs48265.2020.9120998

Abstractive Text Summarization on Google Search Results

Cited by 6 publications (3 citation statements)
References 5 publications
“…They built a dictionary for word embedding, and the abstractive summary was generated with a sequence-to-sequence model that uses a Long Short-Term Memory (LSTM) encoder-decoder trained to produce the result. Three stacked LSTMs may be assembled for the encoder [8].…”
Section: Related Work
confidence: 99%
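The citation statement above describes a sequence-to-sequence summarizer with a three-layer stacked LSTM encoder and an LSTM decoder. The following is a minimal sketch of such an architecture, not the cited authors' code; the vocabulary size, embedding dimension, and hidden size are assumed placeholder values.

# Minimal sketch (assumptions: vocab/embedding/hidden sizes are illustrative)
# of a seq2seq summarizer with a three-layer stacked LSTM encoder and an
# LSTM decoder, as described in the citation statement above.
import tensorflow as tf
from tensorflow.keras import layers, Model

VOCAB_SIZE, EMB_DIM, HID_DIM = 20000, 128, 256  # placeholder hyperparameters

# Encoder: embedding followed by three stacked LSTM layers.
enc_inputs = layers.Input(shape=(None,), name="article_tokens")
x = layers.Embedding(VOCAB_SIZE, EMB_DIM, mask_zero=True)(enc_inputs)
x = layers.LSTM(HID_DIM, return_sequences=True)(x)
x = layers.LSTM(HID_DIM, return_sequences=True)(x)
_, state_h, state_c = layers.LSTM(HID_DIM, return_state=True)(x)

# Decoder: generates the abstractive summary conditioned on the encoder state.
dec_inputs = layers.Input(shape=(None,), name="summary_tokens")
y = layers.Embedding(VOCAB_SIZE, EMB_DIM, mask_zero=True)(dec_inputs)
y = layers.LSTM(HID_DIM, return_sequences=True)(y, initial_state=[state_h, state_c])
dec_out = layers.Dense(VOCAB_SIZE, activation="softmax")(y)

model = Model([enc_inputs, dec_inputs], dec_out)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()

Training would feed the article tokens to the encoder and teacher-forced summary tokens to the decoder, with the next-token targets as labels.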
“…Due to its sequence-to-sequence nature, summarizing video, audio, pictures, and text is a difficult task [3], [4]. Deep learning methods have recently proven to be useful in a variety of domains, including image classification, summarization, machine translation, discourse identification, and text-to-speech production [5]- [8].…”
Section: Introduction
confidence: 99%
“…Several studies using Long Short-Term Memory (LSTM) to summarize documents have been conducted, including either extractive [15], [16] or abstractive summarization [17]- [19], which have proven the performance of LSTM in text summarization. In the text summarization of court decision documents, several methods such as LSA [20], and the merging of several methods such as LSA, LUHN, LEXRANK, and SUMBASIC [21], were employed.…”
Section: Introduction
confidence: 99%
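The last citation statement mentions LSA-based extractive summarization alongside LSTM approaches. As a rough illustration only (not the code of any cited study), an LSA-style extractive summarizer can score sentences by projecting a TF-IDF matrix into a low-rank topic space via SVD and keeping the top-ranked sentences; all names and parameters below are hypothetical.

# Minimal LSA-style extractive summarization sketch: rank sentences by the
# magnitude of their SVD topic-space representation of a TF-IDF matrix.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

def lsa_summarize(sentences, n_sentences=2, n_topics=2):
    tfidf = TfidfVectorizer(stop_words="english")
    X = tfidf.fit_transform(sentences)                       # sentences x terms
    svd = TruncatedSVD(n_components=min(n_topics, X.shape[1] - 1))
    topic_weights = svd.fit_transform(X)                     # sentences x topics
    scores = np.linalg.norm(topic_weights, axis=1)           # sentence salience
    top = sorted(np.argsort(scores)[::-1][:n_sentences])     # keep document order
    return [sentences[i] for i in top]

# Toy usage with illustrative sentences (not from the cited court-decision data).
docs = [
    "The court reviewed the evidence submitted by both parties.",
    "Weather conditions were mild throughout the proceedings.",
    "Based on the evidence, the court ruled in favour of the plaintiff.",
    "The ruling sets a precedent for similar contract disputes.",
]
print(lsa_summarize(docs, n_sentences=2))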